pgbackrest-release-2.37 (commit f18f2d9991da29533ad8e9859074c4f39185420f)

pgbackrest-release-2.37/.cirrus.yml

# Cirrus CI Build Definitions
# ----------------------------------------------------------------------------------------------------------------------------------

# Build the branch if it is integration, a pull request, or ends in -ci/-cic (-cic targets only Cirrus CI)
only_if: $CIRRUS_BRANCH == 'integration' || $CIRRUS_PR != '' || $CIRRUS_BRANCH =~ '.*-ci$' || $CIRRUS_BRANCH =~ '.*-cic$'

# No auto-cancel on integration
auto_cancellation: $CIRRUS_BRANCH != 'integration'

# Arm64
# ----------------------------------------------------------------------------------------------------------------------------------
arm64_task:
  arm_container:
    image: ubuntu:20.04
    cpu: 4
    memory: 2G

  install_script:
    - apt-get update && apt-get install -y perl sudo locales
    - sed -i -e 's/# en_US.UTF-8 UTF-8/en_US.UTF-8 UTF-8/' /etc/locale.gen
    - dpkg-reconfigure --frontend=noninteractive locales
    - update-locale LANG=en_US.UTF-8
    - adduser --disabled-password --gecos "" testuser
    - echo '%testuser ALL=(ALL) NOPASSWD: ALL' >> /etc/sudoers
    - chown -R testuser ${CIRRUS_WORKING_DIR?}

  script:
    - su - testuser -c "${CIRRUS_WORKING_DIR?}/test/ci.pl test --vm=none --sudo --no-tempfs --param=c-only --param=no-coverage"

# FreeBSD 12
# ----------------------------------------------------------------------------------------------------------------------------------
freebsd_12_task:
  freebsd_instance:
    image_family: freebsd-12-2
    cpu: 4
    memory: 4G

  install_script: pkg install -y bash git postgresql-libpqxx pkgconf libxml2 gmake perl5 libyaml p5-YAML-LibYAML rsync

  script:
    - cd .. && perl ${CIRRUS_WORKING_DIR}/test/test.pl --no-gen --make-cmd=gmake --vm=none --vm-max=2 --no-coverage --no-valgrind --module=command --test=backup

  debug_script:
    - ls -lah ${CIRRUS_WORKING_DIR}

# MacOS Catalina
# ----------------------------------------------------------------------------------------------------------------------------------
macos_catalina_task:
  osx_instance:
    image: catalina-xcode

  environment:
    LDFLAGS: -L/usr/local/opt/openssl@1.1/lib -L/usr/local/opt/libpq/lib -L/usr/local/opt/libxml2/lib -L/usr/local/opt/libyaml/lib
    CPPFLAGS: -I/usr/local/opt/openssl@1.1/include -I/usr/local/opt/libpq/include -I/usr/local/opt/libxml2/include/libxml2 -I/usr/local/opt/libyaml/include
    PERL5LIB: /usr/local/opt/perl5/lib/perl5

  install_script:
    - brew install -q openssl@1.1 libpq libxml2 libyaml cpanm
    - cpanm --local-lib=/usr/local/opt/perl5 install YAML::XS

  script:
    - cd .. && ${CIRRUS_WORKING_DIR}/test/test.pl --no-gen --vm=none --vm-max=2 --no-coverage --no-valgrind --module=command --test=backup

  debug_script:
    - ls -lah ${CIRRUS_WORKING_DIR}

pgbackrest-release-2.37/.editorconfig

root = true

[*]
indent_style = space
indent_size = 4
end_of_line = lf
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true

[*.{yaml,yml}]
indent_size = 2

[Makefile.in]
indent_style = tab
indent_size = 4

pgbackrest-release-2.37/.gitattributes

# Classify all .h files as C
*.h linguist-language=C

pgbackrest-release-2.37/.github/ISSUE_TEMPLATE.md

Please provide the following information when submitting an issue (feature requests or general comments can skip this):

1. pgBackRest version:

2. PostgreSQL version:

3.
Operating system/version - if you have more than one server (for example, a database server, a repository host server, one or more standbys), please specify each:

4. Did you install pgBackRest from source or from a package?

5. Please attach the following as applicable:
   - `pgbackrest.conf` file(s)
   - `postgresql.conf` settings applicable to pgBackRest (`archive_command`, `archive_mode`, `listen_addresses`, `max_wal_senders`, `wal_level`, `port`)
   - errors in the postgresql log file before or during the time you experienced the issue
   - log file in `/var/log/pgbackrest` for the commands run (e.g. `/var/log/pgbackrest/mystanza_backup.log`)

6. Describe the issue:

pgbackrest-release-2.37/.github/lock.yml

# Configuration for Lock Threads - https://github.com/dessant/lock-threads

# Number of days of inactivity before a closed issue or pull request is locked
daysUntilLock: 90

# Skip issues and pull requests created before a given timestamp. Timestamp must
# follow ISO 8601 (`YYYY-MM-DD`). Set to `false` to disable
skipCreatedBefore: false

# Issues and pull requests with these labels will be ignored. Set to `[]` to disable
exemptLabels: []

# Label to add before locking, such as `outdated`. Set to `false` to disable
lockLabel: false

# Comment to post before locking. Set to `false` to disable
lockComment: false

# Assign `resolved` as the reason for locking.
# Set to `false` to disable
setLockReason: false

# Limit to only `issues` or `pulls`
# only: issues

# Optionally, specify configuration settings just for `issues` or `pulls`
# issues:
#   exemptLabels:
#     - help-wanted
#   lockLabel: outdated
# pulls:
#   daysUntilLock: 30

# Repository to extend settings from
# _extends: repo

pgbackrest-release-2.37/.github/pull_request_template.md

Please read [Submitting a Pull Request](https://github.com/pgbackrest/pgbackrest/blob/main/CONTRIBUTING.md#submitting-a-pull-request) before submitting.

pgbackrest-release-2.37/.github/workflows/test.yml

name: test

on:
  push:
    branches:
      - integration
      - '**-ci'
      - '**-cig'
  pull_request:
    branches:
      - integration
      - '**-ci'
      - '**-cig'

jobs:
  test:
    runs-on: ubuntu-18.04

    strategy:
      # Let all the jobs run to completion even if one fails
      fail-fast: false

      # The first jobs should be the canaries in the coal mine, i.e. the most likely to fail if there are problems in the code. They
      # should also be a good mix of unit, integration, and documentation tests.
      #
      # In general tests should be ordered from slowest to fastest. This does not make a difference for testing a single commit, but
      # when multiple commits are being tested it is best to have the slowest jobs first so that as jobs become available they will
      # tackle the slowest tests first.
      matrix:
        include:
          # All unit (without coverage) and integration tests for 32-bit
          - param: test --vm=d9 --param=no-performance

          # Debian/Ubuntu documentation
          - param: doc --vm=u18

          # All integration tests
          - param: test --vm=u20 --param=build-package --param=module=mock --param=module=real

          # All unit tests (with coverage) on the newest gcc available
          - param: test --vm=f33 --param=c-only --param=tz=America/New_York

          # RHEL documentation
          - param: doc --vm=rh8

          # All integration tests
          - param: test --vm=rh7 --param=module=mock --param=module=real

    steps:
      - name: Checkout Code
        uses: actions/checkout@v2
        with:
          path: pgbackrest

      - name: Run Test
        run: cd ${HOME?} && ${GITHUB_WORKSPACE?}/pgbackrest/test/ci.pl ${{matrix.param}} --param=build-max=2

  codeql:
    runs-on: ubuntu-latest

    permissions:
      actions: read
      contents: read
      security-events: write

    strategy:
      fail-fast: false
      matrix:
        language:
          - cpp

    steps:
      - name: Checkout Code
        uses: actions/checkout@v2

      - name: Install Packages
        run: sudo apt-get install -y --no-install-recommends libyaml-dev

      - name: Initialize CodeQL
        uses: github/codeql-action/init@v1
        with:
          languages: ${{matrix.language}}

      - name: Build
        run: ${GITHUB_WORKSPACE?}/src/configure && make -j 2

      - name: Perform CodeQL Analysis
        uses: github/codeql-action/analyze@v1

pgbackrest-release-2.37/.gitignore

**/*~
*~
*.swp
.DS_Store

pgbackrest-release-2.37/.travis.yml

branches:
  only:
    - integration
    - /-ci$/
    - /-cit$/

os: linux
dist: bionic

language: c

jobs:
  # Run unit tests that provide wide coverage on multiple architectures.
  include:
    # Valgrind is disabled due to some platform-specific issues in getpwuid() and getgrgid() that do not seem to be pgBackRest bugs.
    - arch: ppc64le
      env:
        - PGB_CI="test --vm=none --param=no-coverage --param=module=command --param=module=storage --param=no-valgrind"
      services:

    - arch: arm64
      env:
        - PGB_CI="test --vm=none --param=no-coverage --param=module=command --param=module=storage"
      services:

    - arch: s390x
      env:
        - PGB_CI="test --vm=none --param=no-coverage --param=module=command --param=module=storage"
      services:

install:
  - umask 0022 && cd ~ && pwd && whoami && umask && groups
  - df -Th && top -bn1

script:
  - ${TRAVIS_BUILD_DIR?}/test/ci.pl ${PGB_CI?}

pgbackrest-release-2.37/CODING.md

# pgBackRest
Coding Standards

## Standards

### Indentation

Indentation is four spaces -- no tabs. Only file types that absolutely require tabs (e.g. `Makefile`) may use them.

### Line Length

With the exception of documentation code, no line of any code or test file shall exceed 132 characters. If a line break is required, then it shall be after the first function parenthesis:

```
// CORRECT - location of line break after first function parenthesis if line length is greater than 132
StringList *removeList = infoBackupDataLabelList(
    infoBackup, strNewFmt("^%s.*", strZ(strLstGet(currentBackupList, fullIdx))));

// INCORRECT
StringList *removeList = infoBackupDataLabelList(infoBackup, strNewFmt("^%s.*", strZ(strLstGet(currentBackupList, fullIdx))));
```

If a conditional, then after a completed conditional, for example:

```
// CORRECT - location of line break after a completed conditional if line length is greater than 132
if (archiveInfoPgHistory.id != backupInfoPgHistory.id ||
    archiveInfoPgHistory.systemId != backupInfoPgHistory.systemId ||
    archiveInfoPgHistory.version != backupInfoPgHistory.version)

// INCORRECT
if (archiveInfoPgHistory.id != backupInfoPgHistory.id || archiveInfoPgHistory.systemId != backupInfoPgHistory.systemId || archiveInfoPgHistory.version != backupInfoPgHistory.version)
```

### Function Comments

Comments for `extern` functions should be included in the `.h` file. Comments for `static` functions and implementation-specific notes for `extern` functions (i.e., not of interest to the general user) should be included in the `.c` file.

### Inline Comment

Inline comments shall start at character 69 and must not exceed the line length of 132.
For example:

```
typedef struct InlineCommentExample
{
    const String *comment;                                          // Inline comment example
    const String *longComment;                                      // Inline comment example that exceeds 132 characters should
                                                                    // then go to next line but this should be avoided
} InlineCommentExample;
```

### Naming

#### Variables

Variable names use camel case with the first letter lower-case.

- `stanzaName` - the name of the stanza
- `nameIdx` - loop variable for iterating through a list of names

Variable names should be descriptive. Avoid `i`, `j`, etc.

#### Types

Type names use camel case with the first letter upper case:

`typedef struct MemContext <...>`

`typedef enum {<...>} ErrorState;`

#### Constants

**#define Constants**

`#define` constants should be all caps with `_` separators.

```c
#define MY_CONSTANT                                                 "STRING"
```

The value should be aligned at column 69 whenever possible.

This type of constant should mostly be used for strings. Use enums whenever possible for integer constants.

**String Constants**

String constants can be declared using the `STRING_STATIC()` macro for local strings and `STRING_EXTERN()` for strings that will be externed for use in other modules.

Externed strings should be declared in the header file as:

```c
#define SAMPLE_VALUE                                                "STRING"
    STRING_DECLARE(SAMPLE_VALUE_STR);
```

And in the C file as:

```c
STRING_EXTERN(SAMPLE_VALUE_STR, SAMPLE_VALUE);
```

Static strings declared in the C file are not required to have a `#define` if the `#define` version is not used. Externed strings must always have the `#define` in the header file.

**Enum Constants**

Enum elements follow the same case rules as variables. They are strongly typed so this shouldn't present any confusion.

```c
typedef enum
{
    cipherModeEncrypt,
    cipherModeDecrypt,
} CipherMode;
```

Note the comma after the last element. This reduces diff churn when new elements are added.

#### Macros

Macro names should be upper-case with underscores between words.
Macros (except simple constants) should be avoided whenever possible as they make code less clear and test coverage harder to measure.

Macros should follow the format:

```c
#define MACRO(paramName1, paramName2)                                                                                              \
```

If the macro defines a block it should look like:

```c
#define MACRO_2(paramName1, paramName2)                                                                                            \
{                                                                                                                                  \
                                                                                                                                   \
}
```

Continuation characters should be aligned at column 132 (unlike the examples above that have been shortened for display purposes).

To avoid conflicts, variables in a macro will be named `[macro name]_[var name]`, e.g. `TEST_RESULT_resultExpected`. Variables that need to be accessed in wrapped code should be provided accessor macros.

[Variadic functions](#variadic-functions) are an exception to the capitalization rule.

#### Begin / End

Use `Begin` / `End` for names rather than `Start` / `Finish`, etc.

#### New / Free

Use `New` / `Free` for constructors and destructors rather than `Create` / `Destroy`, etc.

### Formatting

#### Braces

C allows braces to be excluded for a single statement. However, braces should be used when the control statement (if, while, etc.) spans more than one line or the statement to be executed spans more than one line.

No braces needed:

```c
if (condition)
    return value;
```

Braces needed:

```c
if (conditionThatUsesEntireLine1 &&
    conditionThatUsesEntireLine2)
{
    return value;
}
```

```c
if (condition)
{
    return
        valueThatUsesEntireLine1 &&
        valueThatUsesEntireLine2;
}
```

Braces should be added to `switch` statement cases that have a significant amount of code. As a general rule of thumb, if the code block in the `case` is large enough to have blank lines and/or multiple comments then it should be enclosed in braces.

```c
switch (int)
{
    case 1:
        a = 2;
        break;

    case 2:
    {
        // Comment this more complex code
        a = 1;
        b = 2;
        c = func(a, b);

        break;
    }
}
```

#### Hints, Warnings, and Errors

Hints are to be formatted with capitalized `HINT:` followed by a space and a sentence.
The sentence shall only begin with a capital letter if the first word is an acronym (e.g. TLS) or a proper name (e.g. PostgreSQL). The sentence must end with a period, question mark or exclamation point as appropriate.

Warnings and errors shall be lowercase with the exceptions for proper names and acronyms and shall end without punctuation.

## Language Elements

### Data Types

Don't get exotic - use the simplest type that will work.

Use `int` or `unsigned int` for general cases. `int` will be at least 32 bits.

When not using `int` use one of the types defined in `common/type.h`.

### Macros

Don't use a macro when a function could be used instead. Macros make it hard to measure code coverage.

### Objects

Object-oriented programming is used extensively. The object pointer is always referred to as `this`.

An object can expose internal struct members by defining a public struct that contains the members to be exposed and using inline functions to get/set the members. The header file:

```c
/***********************************************************************************************************************************
Getters/setters
***********************************************************************************************************************************/
typedef struct ListPub
{
    unsigned int listSize;                                          // List size
} ListPub;

// List size
__attribute__((always_inline)) static inline unsigned int
lstSize(const List *const this)
{
    return THIS_PUB(List)->listSize;
}
```

`THIS_PUB()` ensures that `this != NULL` so there is no need to check that in the calling function.

And the C file:

```c
struct List
{
    ListPub pub;                                                    // Publicly accessible variables
    ...
};
```

The public struct must be the first member of the private struct. The naming convention for the public struct is to add `Pub` to the end of the private struct name.

### Variadic Functions

Variadic functions can take a variable number of parameters.
While the `printf()` pattern is variadic, it is not very flexible in terms of optional parameters given in any order.

This project implements variadic functions using macros (which are exempt from the normal macro rule of being all caps).

A typical variadic function definition:

```c
typedef struct StoragePathCreateParam
{
    bool errorOnExists;
    bool noParentCreate;
    mode_t mode;
} StoragePathCreateParam;

#define storagePathCreateP(this, pathExp, ...)                                                                                     \
    storagePathCreate(this, pathExp, (StoragePathCreateParam){__VA_ARGS__})

void storagePathCreate(const Storage *this, const String *pathExp, StoragePathCreateParam param);
```

Continuation characters should be aligned at column 132 (unlike the example above that has been shortened for display purposes).

This function can be called without variable parameters:

```c
storagePathCreateP(storageLocal(), "/tmp/pgbackrest");
```

Or with variable parameters:

```c
storagePathCreateP(storageLocal(), "/tmp/pgbackrest", .errorOnExists = true, .mode = 0777);
```

If the majority of functions in a module or object are variadic, it is best to provide macros for all functions even if they do not have variable parameters. Do not use the base function when variadic macros exist.

## Testing

### Uncoverable/Uncovered Code

#### Uncoverable Code

The `uncoverable` keyword marks code that can never be covered. For instance, a function that never returns because it always throws an error. Uncoverable code should be rare to non-existent outside the common libraries and test code.

```c
}   // {uncoverable - function throws error so never returns}
```

Subsequent code that is uncoverable for the same reason is marked with `// {+uncoverable}`.

#### Uncovered Code

Marks code that is not tested for one reason or another. This should be kept to a minimum and an excuse given for each instance.
```c
exit(EXIT_FAILURE);                                                 // {uncovered - test harness does not support non-zero exit}
```

Subsequent code that is uncovered for the same reason is marked with `// {+uncovered}`.

pgbackrest-release-2.37/CONTRIBUTING.md

# pgBackRest
Contributing to pgBackRest

## Table of Contents

[Introduction](#introduction)

[Building a Development Environment](#building-a-development-environment)

[Coding](#coding)

[Testing](#testing)

[Submitting a Pull Request](#submitting-a-pull-request)

## Introduction

This documentation is intended to assist contributors to pgBackRest by outlining some basic steps and guidelines for contributing to the project. Code fixes or new features can be submitted via pull requests. Ideas for new features and improvements to existing functionality or documentation can be [submitted as issues](https://github.com/pgbackrest/pgbackrest/issues).

You may want to check the [Project Boards](https://github.com/pgbackrest/pgbackrest/projects) to see if your suggestion has already been submitted.

Bug reports should be [submitted as issues](https://github.com/pgbackrest/pgbackrest/issues). Please provide as much information as possible to aid in determining the cause of the problem.

You will always receive credit in the [release notes](http://www.pgbackrest.org/release.html) for your contributions.

Coding standards are defined in [CODING.md](https://github.com/pgbackrest/pgbackrest/blob/main/CODING.md) and some important coding details and an example are provided in the [Coding](#coding) section below. At a minimum, unit tests must be written and run and the documentation generated before [submitting a Pull Request](#submitting-a-pull-request); see the [Testing](#testing) section below for details.

## Building a Development Environment

This example is based on Ubuntu 20.04, but it should work on many versions of Debian and Ubuntu.
pgbackrest-dev => Install development tools
```
sudo apt-get install rsync git devscripts build-essential valgrind lcov autoconf \
       autoconf-archive libssl-dev zlib1g-dev libxml2-dev libpq-dev pkg-config \
       libxml-checker-perl libyaml-perl libdbd-pg-perl liblz4-dev liblz4-tool \
       zstd libzstd-dev bzip2 libbz2-dev libyaml-dev
```

Some unit tests and all the integration tests require Docker. Running in containers allows us to simulate multiple hosts, test on different distributions and versions of PostgreSQL, and use sudo without affecting the host system.

pgbackrest-dev => Install Docker
```
curl -fsSL https://get.docker.com | sudo sh
sudo usermod -aG docker `whoami`
```

This clone of the pgBackRest repository is sufficient for experimentation. For development, create a fork and clone that instead.

pgbackrest-dev => Clone pgBackRest repository
```
git clone https://github.com/pgbackrest/pgbackrest.git
```

If using a RHEL-based system, the CPAN XML parser is required to run `test.pl` and `doc.pl`. Instructions for installing Docker and the XML parser can be found in the `README.md` file of the pgBackRest [doc](https://github.com/pgbackrest/pgbackrest/blob/main/doc) directory in the section "The following is a sample RHEL 7 configuration that can be used for building the documentation". NOTE that the "Install latex (for building PDF)" section is not required since testing of the docs need only be run for HTML output.

## Coding

The following sections provide information on some important concepts needed for coding within pgBackRest.

### Memory Contexts

Memory is allocated inside contexts and can be long lasting (for objects) or temporary (for functions). In general, use `OBJ_NEW_BEGIN(MyObj)` for objects and `MEM_CONTEXT_TEMP_BEGIN()` for functions. See [memContext.h](https://github.com/pgbackrest/pgbackrest/blob/main/src/common/memContext.h) for more details and the [Coding Example](#coding-example) below.
### Logging

Logging is used for debugging with the built-in macros `FUNCTION_LOG_*()` and `FUNCTION_TEST_*()`, which are used to trace parameters passed to/returned from functions. `FUNCTION_LOG_*()` macros are used for production logging whereas `FUNCTION_TEST_*()` macros will be compiled out of production code. For functions where no parameter is valuable enough to justify the cost of debugging in production, use `FUNCTION_TEST_BEGIN()/FUNCTION_TEST_END()`, else use `FUNCTION_LOG_BEGIN(someLogLevel)/FUNCTION_LOG_END()`. See [debug.h](https://github.com/pgbackrest/pgbackrest/blob/main/src/common/debug.h) for more details and the [Coding Example](#coding-example) below.

Logging is also used for providing information to the user via the `LOG_*()` macros, such as `LOG_INFO("some informational message")` and `LOG_WARN_FMT("no prior backup exists, %s backup has been changed to full", strZ(cfgOptionDisplay(cfgOptType)))`, and also via `THROW_*()` macros for throwing an error. See [log.h](https://github.com/pgbackrest/pgbackrest/blob/main/src/common/log.h) and [error.h](https://github.com/pgbackrest/pgbackrest/blob/main/src/common/error.h) for more details and the [Coding Example](#coding-example) below.

### Coding Example

The example below is not structured like an actual implementation and is intended only to provide an understanding of some of the more common coding practices. The comments in the example are only here to explain the example and are not representative of the coding standards. Refer to the Coding Standards document ([CODING.md](https://github.com/pgbackrest/pgbackrest/blob/main/CODING.md)) and sections above for an introduction to the concepts provided here. For an actual implementation, see [db.h](https://github.com/pgbackrest/pgbackrest/blob/main/src/db/db.h) and [db.c](https://github.com/pgbackrest/pgbackrest/blob/main/src/db/db.c).
#### Example: hypothetical basic object construction

```c
/*
 * HEADER FILE - see db.h for a complete implementation example
 */

// Typedef the object declared in the C file
typedef struct MyObj MyObj;

// Constructor, and any functions in the header file, are all declared on one line
MyObj *myObjNew(unsigned int myData, const String *secretName);

// Declare the publicly accessible variables in a structure with Pub appended to the name
typedef struct MyObjPub                                             // First letter upper case
{
    unsigned int myData;                                            // Contents of the myData variable
} MyObjPub;

// Declare getters and setters inline for the publicly visible variables
// Only setters require "Set" appended to the name
__attribute__((always_inline)) static inline unsigned int
myObjMyData(const MyObj *const this)
{
    return THIS_PUB(MyObj)->myData;                                 // Use the built-in THIS_PUB macro
}

// Destructor
__attribute__((always_inline)) static inline void
myObjFree(MyObj *const this)
{
    objFree(this);
}

// TYPE and FORMAT macros for function logging
#define FUNCTION_LOG_MY_OBJ_TYPE                                                                                                   \
    MyObj *
#define FUNCTION_LOG_MY_OBJ_FORMAT(value, buffer, bufferSize)                                                                      \
    FUNCTION_LOG_STRING_OBJECT_FORMAT(value, myObjToLog, buffer, bufferSize)

/*
 * C FILE - see db.c for a more complete and actual implementation example
 */

// Declare the object type
struct MyObj
{
    MyObjPub pub;                                                   // Publicly accessible variables must be first and named "pub"
    const String *name;                                             // Pointer to lightweight string object - see string.h
};

// Object constructor, and any functions in the C file, have the return type and function signature on separate lines
MyObj *
myObjNew(unsigned int myData, const String *secretName)
{
    FUNCTION_LOG_BEGIN(logLevelDebug);                              // Use FUNCTION_LOG_BEGIN with a log level for production display
        FUNCTION_LOG_PARAM(UINT, myData);                           // When log level is debug, myData variable will be logged
        FUNCTION_TEST_PARAM(STRING, secretName);                    // FUNCTION_TEST_PARAM will not display secretName in production
    FUNCTION_LOG_END();
    ASSERT(secretName != NULL || myData > 0);                       // Development-only assertions (compiled out of production code)

    MyObj *this = NULL;                                             // Declared in the parent context: lives as long as the parent

    OBJ_NEW_BEGIN(MyObj)                                            // Create a long lasting memory context named for the object
    {
        this = OBJ_NEW_ALLOC();                                     // Allocate the memory required by the object

        *this = (MyObj)                                             // Initialize the object
        {
            .pub =
            {
                .myData = myData,                                   // Copy the simple data type to this object
            },
            .name = strDup(secretName),                             // Duplicate the String into this object's memory context
        };
    }
    OBJ_NEW_END();

    FUNCTION_LOG_RETURN(MY_OBJ, this);
}

// Function using a temporary memory context
String *
myObjDisplay(unsigned int myData)
{
    FUNCTION_TEST_BEGIN();                                          // No parameters of this function are logged in production
        FUNCTION_TEST_PARAM(UINT, myData);
    FUNCTION_TEST_END();

    String *result = NULL;                                          // Created in the caller's context ("prior context" below)

    MEM_CONTEXT_TEMP_BEGIN()                                        // Begin a new temporary context
    {
        String *resultStr = strNewZ("Hello");                       // Allocate a string in the temporary memory context

        if (myData > 1)
            resultStr = strCatZ(resultStr, " World");               // Append a value to the string, still in the temporary context
        else
            LOG_WARN("Am I not your World?");                       // Log a warning to the user

        MEM_CONTEXT_PRIOR_BEGIN()                                   // Switch to the prior context for the string duplication
        {
            result = strDup(resultStr);                             // Create a copy of the string in the caller's context
        }
        MEM_CONTEXT_PRIOR_END();                                    // Switch back to the temporary context
    }
    MEM_CONTEXT_TEMP_END();                                         // Free everything created in this temporary context - i.e. resultStr

    FUNCTION_TEST_RETURN(STRING, result);                           // Return result but do not log the value in production
}

// Create the logging function for displaying important information from the object
String *
myObjToLog(const MyObj *this)
{
    return strNewFmt(
        "{name: %s, myData: %u}", this->name == NULL ?
        NULL_Z : strZ(this->name), myObjMyData(this));
}
```

## Testing

A list of all possible test combinations can be viewed by running:

```
pgbackrest/test/test.pl --dry-run
```

While some files are automatically generated during `make`, others are generated by running the test harness as follows:

```
pgbackrest/test/test.pl --gen-only
```

Prior to any submission, the html version of the documentation should also be run and the output checked by viewing the generated html on the local file system under `pgbackrest/doc/output/html`. More details can be found in the pgBackRest [doc/README.md](https://github.com/pgbackrest/pgbackrest/blob/main/doc/README.md) file.

```
pgbackrest/doc/doc.pl --out=html
```

> **NOTE:** `ERROR: [028]` regarding cache is invalid is OK; it just means there have been changes and the documentation will be built from scratch. In this case, be patient as the build could take 20 minutes or more depending on your system.

### Running Tests

Examples of test runs are provided in the following sections. There are several important options for running a test:

- `--dry-run` - without any other options, this will list all the available tests
- `--module` - identifies the module in which the test is located
- `--test` - the actual test set to be run
- `--run` - a number identifying the run within a test if testing a single run rather than the entire test
- `--vm-out` - displays the test output (helpful for monitoring the progress)
- `--vm` - identifies the pre-built container when using Docker, otherwise the setting should be `none`. See [test.yml](https://github.com/pgbackrest/pgbackrest/blob/main/.github/workflows/test.yml) for a list of valid vm codes noted by `param: test`.
For more options, run the test or documentation engine with the `--help` option:
```
pgbackrest/test/test.pl --help
pgbackrest/doc/doc.pl --help
```

#### Without Docker

If Docker is not installed, then the available tests can be listed using `--vm=none`, and each test must then be run with `--vm=none`.

pgbackrest-dev => List tests that don't require a container
```
pgbackrest/test/test.pl --vm=none --dry-run

--- output ---

    P00 INFO: test begin on x86_64 - log level info
    P00 INFO: configure build
    P00 INFO: builds required: bin
--> P00 INFO: 74 tests selected

    P00 INFO: P1-T01/74 - vm=none, module=common, test=error
    [filtered 71 lines of output]
    P00 INFO: P1-T73/74 - vm=none, module=performance, test=type
    P00 INFO: P1-T74/74 - vm=none, module=performance, test=storage
--> P00 INFO: DRY RUN COMPLETED SUCCESSFULLY
```

pgbackrest-dev => Run a test
```
pgbackrest/test/test.pl --vm=none --vm-out --module=common --test=wait

--- output ---

P00 INFO: test begin on x86_64 - log level info
P00 INFO: autogenerate configure
P00 INFO: autogenerated version in configure.ac script: no changes
P00 INFO: autogenerated configure script: no changes
P00 INFO: autogenerate code
P00 INFO: cleanup old data
P00 INFO: builds required: none
P00 INFO: 1 test selected

P00 INFO: P1-T1/1 - vm=none, module=common, test=wait

run 1 - waitNew(), waitMore, and waitFree()
    L0018 expect AssertError: assertion 'waitTime <= 999999000' failed

run 1/1 ------------- L0021 0ms wait
    L0025 new wait
    L0026 check remaining time
    L0027 check wait time
    L0028 check sleep time
    L0029 check sleep prev time
    L0030 no wait more

    L0033 new wait = 0.2 sec
    L0034 check remaining time
    L0035 check wait time
    L0036 check sleep time
    L0037 check sleep prev time
    L0038 check begin time
    L0044 lower range check
    L0045 upper range check
    L0047 free wait

    L0052 new wait = 1.1 sec
    L0053 check wait time
    L0054 check sleep time
    L0055 check sleep prev time
    L0056 check begin time
    L0062 lower range check
    L0063 upper range check
    L0065 free wait

TESTS COMPLETED SUCCESSFULLY

P00 INFO: P1-T1/1 - vm=none, module=common, test=wait
P00 INFO: tested modules have full coverage
P00 INFO: writing C coverage report
P00 INFO: TESTS COMPLETED SUCCESSFULLY
```

An entire module can be run by using only the `--module` option.

pgbackrest-dev => Run a module
```
pgbackrest/test/test.pl --vm=none --module=postgres

--- output ---

P00 INFO: test begin on x86_64 - log level info
P00 INFO: autogenerate configure
P00 INFO: autogenerated version in configure.ac script: no changes
P00 INFO: autogenerated configure script: no changes
P00 INFO: autogenerate code
P00 INFO: cleanup old data
P00 INFO: builds required: none
P00 INFO: 2 tests selected

P00 INFO: P1-T1/2 - vm=none, module=postgres, test=client
P00 INFO: P1-T2/2 - vm=none, module=postgres, test=interface
P00 INFO: tested modules have full coverage
P00 INFO: writing C coverage report
P00 INFO: TESTS COMPLETED SUCCESSFULLY
```

#### With Docker

Build a container to run tests. The vm must be pre-configured but a variety are available. A vagrant file is provided in the test directory as an example of running in a virtual environment. The vm names are all three character abbreviations, e.g. `u20` for Ubuntu 20.04.

pgbackrest-dev => Build a VM
```
pgbackrest/test/test.pl --vm-build --vm=u20

--- output ---

P00 INFO: test begin on x86_64 - log level info
P00 INFO: Using cached pgbackrest/test:u20-base-20210930A image (7ffb73ceb9a2e3aad2cba7eb5c8e28fc3982db18) ...
P00 INFO: Building pgbackrest/test:u20-test image ...
P00 INFO: Build Complete
```
> **NOTE:** to build all the vms, just omit the `--vm` option above.
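As a quick sanity check after a build (not from the pgBackRest docs; assumes a standard Docker installation), the images produced by `--vm-build` can be listed with standard Docker commands, since they follow the `pgbackrest/test:<vm>-*` naming shown in the output above:

```shell
# List the base and test images built for the pgbackrest/test repository
docker images pgbackrest/test
```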
pgbackrest-dev => Run a Specific Test Run
```
pgbackrest/test/test.pl --vm=u20 --module=mock --test=archive --run=2

--- output ---

P00 INFO: test begin on x86_64 - log level info
P00 INFO: autogenerate configure
P00 INFO: autogenerated version in configure.ac script: no changes
P00 INFO: autogenerated configure script: no changes
P00 INFO: autogenerate code
P00 INFO: cleanup old data and containers
P00 INFO: builds required: bin, bin host
P00 INFO: build bin for u20 (/home/vagrant/test/bin/u20)
P00 INFO: bin dependencies have changed, rebuilding
P00 INFO: build bin for none (/home/vagrant/test/bin/none)
P00 INFO: bin dependencies have changed, rebuilding
P00 INFO: 1 test selected

P00 INFO: P1-T1/1 - vm=u20, module=mock, test=archive, run=2
P00 INFO: no code modules had all tests run required for coverage
P00 INFO: TESTS COMPLETED SUCCESSFULLY
```

### Writing a Unit Test

The goal of unit testing is to have 100 percent code coverage. Two files will usually be involved in this process:

- **define.yaml** - defines the number of tests to be run for each module and test file. There is a comment at the top of the file that provides more information about this file.
- **src/module/somefileTest.c** - where "somefile" is the path and name of the test file where the unit tests are located for the code being updated (e.g. `src/module/command/expireTest.c`).

#### define.yaml

Each module is separated by a line of asterisks (*) and each test within is separated by a line of dashes (-). In the example below, the module is `command` and the unit test is `check`. The number of calls to `testBegin()` in a unit test file will dictate the number following `total:`, in this case 4. Under `coverage:` is the list of files that will be tested.
```
# ********************************************************************************************************************************
- name: command

  test:
    # ----------------------------------------------------------------------------------------------------------------------------
    - name: check
      total: 4
      containerReq: true

      coverage:
        - command/check/common
        - command/check/check
```

#### somefileTest.c

Unit test files are organized in the `test/src/module` directory with the same directory structure as the source code being tested. For example, if new code is added to src/**command/expire**.c then test/src/module/**command/expire**Test.c will need to be updated.

Assuming that a test file already exists, new unit tests will either go in a new `testBegin()` section or be added to an existing section. Each such section is a test run. The comment string passed to `testBegin()` should reflect the function(s) being tested in the test run. Tests within a run should use `TEST_TITLE()` with a comment string describing the test.

```
// *****************************************************************************************************************************
if (testBegin("expireBackup()"))
{
    // -------------------------------------------------------------------------------------------------------------------------
    TEST_TITLE("manifest file removal");
```

#### Setting up the command to be run

The [harnessConfig.h](https://github.com/pgbackrest/pgbackrest/blob/main/test/src/common/harnessConfig.h) file describes a list of functions that should be used when configuration options are required for a command being tested. Options are set in a `StringList` which must be defined and passed to the `HRN_CFG_LOAD()` macro with the command.
For example, the following will set up a test to run the `pgbackrest --repo-path=test/test-0/repo info` command on multiple repositories, one of which is encrypted:

```
StringList *argList = strLstNew();                                              // Create an empty string list
hrnCfgArgRawZ(argList, cfgOptRepoPath, TEST_PATH "/repo");                      // Add the --repo-path option
hrnCfgArgKeyRawZ(argList, cfgOptRepoPath, 2, TEST_PATH "/repo2");               // Add the --repo2-path option
hrnCfgArgKeyRawStrId(argList, cfgOptRepoCipherType, 2, cipherTypeAes256Cbc);    // Add the --repo2-cipher-type option
hrnCfgEnvKeyRawZ(cfgOptRepoCipherPass, 2, TEST_CIPHER_PASS);                    // Set environment variable for the --repo2-cipher-pass option
HRN_CFG_LOAD(cfgCmdInfo, argList);                                              // Load the command and option list into the test harness
```

#### Storing a file

Sometimes it is desirable to store or manipulate files before or during a test and then confirm the contents. The [harnessStorage.h](https://github.com/pgbackrest/pgbackrest/blob/main/test/src/common/harnessStorage.h) file contains macros (e.g. `HRN_STORAGE_PUT` and `TEST_STORAGE_GET`) for doing this. In addition, `HRN_INFO_PUT` is convenient for writing out info files (archive.info, backup.info, backup.manifest) since it will automatically add header and checksum information.

```
HRN_STORAGE_PUT_EMPTY(
    storageRepoWrite(), STORAGE_REPO_ARCHIVE "/10-1/000000010000000100000001-abcdabcdabcdabcdabcdabcdabcdabcdabcdabcd.gz");
```

#### Testing results

Tests are run and results confirmed via macros that are described in [harnessTest.h](https://github.com/pgbackrest/pgbackrest/blob/main/test/src/common/harnessTest.h). With the exception of `TEST_ERROR`, the third parameter is a short description of the test. Some of the more common macros are:

- `TEST_RESULT_STR` - Test the actual value of the string returned by the function.
- `TEST_RESULT_UINT` / `TEST_RESULT_INT` - Test for an unsigned integer / integer.
- `TEST_RESULT_BOOL` - Test a boolean value.
- `TEST_RESULT_PTR` / `TEST_RESULT_PTR_NE` - Test a pointer: useful for testing if the pointer is `NULL` or not equal (`NE`) to `NULL`.
- `TEST_RESULT_VOID` - The function being tested returns a `void`. This is then usually followed by tests that ensure other actions occurred (e.g. a file was written to disk).
- `TEST_ERROR` / `TEST_ERROR_FMT` - Test that a specific error code was raised with specific wording.

> **NOTE:** `HRN_*` macros should be used only for test setup and cleanup. `TEST_*` macros must be used for testing results.

#### Testing a log message

If a function being tested logs something with `LOG_WARN`, `LOG_INFO` or another `LOG_*()` macro, then the logged message must be cleared before the end of the test by using the `TEST_RESULT_LOG()`/`TEST_RESULT_LOG_FMT()` macros.

```
TEST_RESULT_LOG(
    "P00 WARN: WAL segment '000000010000000100000001' was not pushed due to error [25] and was manually skipped: error");
```

In the above, `Pxx` indicates the process (P) and the process number (xx), e.g. P00, P01.

#### Testing using child process

Sometimes it is useful to use a child process for testing. Below is a simple example. See [harnessFork.h](https://github.com/pgbackrest/pgbackrest/blob/main/test/src/common/harnessFork.h) for more details.
```
HRN_FORK_BEGIN()
{
    HRN_FORK_CHILD_BEGIN()
    {
        TEST_RESULT_INT_NE(
            lockAcquire(cfgOptionStr(cfgOptLockPath), STRDEF("stanza1"), STRDEF("999-ffffffff"), lockTypeBackup, 0, true),
            -1, "create backup/expire lock");

        // Notify parent that lock has been acquired
        HRN_FORK_CHILD_NOTIFY_PUT();

        // Wait for parent to allow release lock
        HRN_FORK_CHILD_NOTIFY_GET();

        lockRelease(true);
    }
    HRN_FORK_CHILD_END();

    HRN_FORK_PARENT_BEGIN()
    {
        // Wait for child to acquire lock
        HRN_FORK_PARENT_NOTIFY_GET(0);

        HRN_CFG_LOAD(cfgCmdInfo, argListText);
        TEST_RESULT_STR_Z(
            infoRender(),
            "stanza: stanza1\n"
            "    status: error (no valid backups, backup/expire running)\n"
            "    cipher: none\n"
            "\n"
            "    db (current)\n"
            "        wal archive min/max (9.4): none present\n",
            "text - single stanza, no valid backups, backup/expire lock detected");

        // Notify child to release lock
        HRN_FORK_PARENT_NOTIFY_PUT(0);
    }
    HRN_FORK_PARENT_END();
}
HRN_FORK_END();
```

#### Testing using a shim

A PostgreSQL libpq shim is provided to simulate interactions with PostgreSQL. Below is a simple example. See [harnessPq.h](https://github.com/pgbackrest/pgbackrest/blob/main/test/src/common/harnessPq.h) for more details.

```
// Set up two standbys but no primary
harnessPqScriptSet((HarnessPq [])
{
    HRNPQ_MACRO_OPEN_GE_92(1, "dbname='postgres' port=5432", PG_VERSION_92, "/pgdata", true, NULL, NULL),
    HRNPQ_MACRO_OPEN_GE_92(8, "dbname='postgres' port=5433", PG_VERSION_92, "/pgdata", true, NULL, NULL),

    // Close the "inner" session first (8) then the outer (1)
    HRNPQ_MACRO_CLOSE(8),
    HRNPQ_MACRO_CLOSE(1),

    HRNPQ_MACRO_DONE()
});

TEST_ERROR(cmdCheck(), ConfigError, "primary database not found\nHINT: check indexed pg-path/pg-host configurations");
```

### Running a Unit Test

**Code Coverage**

Unit tests are run for all files that are listed in `define.yaml` and a coverage report generated for each file listed under the tag `coverage:`.
Note that some files are listed in multiple `coverage:` sections for a module; in this case, each test for the file being modified should be specified for the module in which the file exists (e.g. `--module=storage --test=posix --test=gcs`, etc.) or, alternatively, simply run the module without the `--test` option.

It is recommended that a `--vm` be specified since running the same test for multiple vms is unnecessary for coverage. The following example would run the test set from the **define.yaml** section detailed above.

```
pgbackrest/test/test.pl --vm-out --module=command --test=check --vm=u20
```

> **NOTE:** Not all systems perform at the same speed, so if a test is timing out, try rerunning with another vm.

Because a test run has not been specified, a coverage report will be generated and written to the local file system under the pgBackRest directory `test/result/coverage/lcov/index.html` and a file with only the highlighted code that has not been covered will be written to `test/result/coverage/coverage.html`.

If 100 percent code coverage has not been achieved, an error message will be displayed, for example: `ERROR: [125]: c module command/check/check is not fully covered`

**Debugging with files**

Sometimes it is useful to look at files that were generated during the test. The default for running any test is that, at the start/end of the test, the test harness will clean up all files and directories created. To override this behavior, a single test run must be specified and the option `--no-cleanup` provided.

Again, continuing with the check command, from **define.yaml** above, there are four tests. Below, test one will be run and nothing will be cleaned up so that the files and directories in `test/test-0` can be inspected.

```
pgbackrest/test/test.pl --vm-out --module=command --test=check --run=1 --no-cleanup
```

### Understanding Test Output

The following is a small sample of a typical test output.
```
run 8 - expireTimeBasedBackup()

run 8/1 ------------- L2285 no current backups
000.002s L2298 empty backup.info
000.009s 000.007s L2300 no backups to expire
```

**run 8 - expireTimeBasedBackup()** - indicates the run number (8) within the module and the parameter provided to testBegin, e.g. `testBegin("expireTimeBasedBackup()")`

**run 8/1 ------------- L2285 no current backups** - this is the first test (1) in run 8, which is the `TEST_TITLE("no current backups");` at line number 2285.

**000.002s L2298 empty backup.info** - the first number, 000.002s, is the time in seconds that the test started from the beginning of the run. L2298 is the line number of the test and `empty backup.info` is the test comment.

**000.009s 000.007s L2300 no backups to expire** - again, 000.009s is the time in seconds that the test started from the beginning of the run. The second number, 000.007s, is the run time of the **previous** test (i.e. the `empty backup.info` test took 000.007 seconds to execute). L2300 is the line number of the test and `no backups to expire` is the test comment.

## Adding an Option

Options can be added to a command or multiple commands. Options can be configuration file only, command-line only or valid for both. Once an option is successfully added, the `config.auto.h` and `parse.auto.c` files will automatically be generated by the build system.

To add an option, two files need to be modified:

- `src/build/config/config.yaml`
- `src/build/help/help.xml`

These files are discussed in the following sections along with how to verify the `help` command output.

### config.yaml

There are detailed comment blocks above each section that explain the rules for defining commands and options. Regarding options, there are two types: 1) command line only, and 2) configuration file. With the exception of secrets, all configuration file options can be passed on the command line. To configure an option for the configuration file, the `section:` key must be present.
The `option:` section is broken into sub-sections by a simple comment divider (e.g. `# Repository options`) under which the options are organized alphabetically by option name. To better explain this section, two hypothetical examples will be discussed. For more details, see [config.yaml](https://github.com/pgbackrest/pgbackrest/blob/main/src/build/config/config.yaml).

#### EXAMPLE 1 hypothetical command line only option

```
set:
  type: string
  command:
    backup:
      depend:
        option: stanza
      required: false
    restore:
      default: latest
  command-role:
    main: {}
```

Note that `section:` is not present, thereby making this a command-line only option, defined as follows:

- `set` - the name of the option
- `type` - the type of the option. Valid values for types are: `boolean`, `hash`, `integer`, `list`, `path`, `size`, `string`, and `time`
- `command` - lists each command for which the option is valid. If a command is not listed, then the option is not valid for the command and an error will be thrown if it is attempted to be used for that command. In this case the valid commands are `backup` and `restore`.
- `backup` - details the requirements for the `--set` option for the `backup` command. It is dependent on the option `--stanza`, meaning it is only allowed to be specified for the `backup` command if the `--stanza` option has been specified. And `required: false` indicates that the `--set` option is never required, even with the dependency.
- `restore` - details the requirements for the `--set` option for the `restore` command. Since `required:` is omitted, it is not required to be set by the user but it is required by the command and will default to `latest` if it has not been specified by the user.
- `command-role` - defines the processes for which the option is valid. `main` indicates the option will be used by the main process and not be passed on to other local/remote processes.
#### EXAMPLE 2 hypothetical configuration file option

```
repo-test-type:
  section: global
  type: string
  group: repo
  default: full
  allow-list:
    - full
    - diff
    - incr
  command:
    backup: {}
    restore: {}
  command-role:
    main: {}
```

- `repo-test-type` - the name of the option
- `section` - the section of the configuration file where this option is valid (omitted for command line only options, see [Example 1](#example-1-hypothetical-command-line-only-option) above)
- `type` - the type of the option. Valid values for types are: `boolean`, `hash`, `integer`, `list`, `path`, `size`, `string`, and `time`
- `group` - indicates that this option is part of the `repo` group of indexed options and therefore will follow the indexing rules, e.g. `repo1-test-type`.
- `default` - sets a default for the option if the option is not provided when the command is run. The default can be global (as it is here) or it can be specified for a specific command in the command section (as in [Example 1](#example-1-hypothetical-command-line-only-option) above).
- `allow-list` - lists the allowable values for the option for all commands for which the option is valid.
- `command` - lists each command for which the option is valid. If a command is not listed, then the option is not valid for the command and an error will be thrown if it is attempted to be used for that command. In this case the valid commands are `backup` and `restore`.
- `command-role` - defines the processes for which the option is valid. `main` indicates the option will be used by the main process and not be passed on to other local/remote processes.

At compile time, the `config.auto.h` file will be generated to contain the constants used for options in the code. For the C enums, any dashes in the option name will be removed, the name camel-cased and prefixed with `cfgOpt`, e.g. `repo-path` becomes `cfgOptRepoPath`.

### help.xml

All options must be documented or the system will error during the build.
To add an option, find the command section identified by `command id="COMMAND"` where `COMMAND` is the name of the command (e.g. `expire`) or, if the option is used by more than one command and the definition for the option is the same for all of the commands, the `operation-general title="General Options"` section.

To add an option, add the following to the `<option-list>` section; if it does not exist, then wrap the following in `<options></options>`. This example uses the boolean option `force` of the `restore` command. Simply replace that with your new option and the appropriate `summary`, `text` and `example`.

```
<option id="force" name="Force">
    <summary>Force a restore.</summary>

    <text>...</text>

    <example>y</example>
</option>
```

> **IMPORTANT:** A period (.) is required to end the `summary` section.

### Testing the help

It is important to run the `help` command unit test after adding an option in case a change is required:

```
pgbackrest/test/test.pl --module=command --test=help --vm-out
```

To verify the `help` command output, build the pgBackRest executable:

```
pgbackrest/test/test.pl --vm=none --build-only
```

Use the pgBackRest executable to test the help output:

```
test/bin/none/pgbackrest help backup repo-type
```

### Testing the documentation

To quickly view the HTML documentation, the `--no-exe` option can be passed to the documentation generator in order to bypass executing the code elements:

```
pgbackrest/doc/doc.pl --out=html --no-exe
```

The generated HTML files will be placed in the `doc/output/html` directory where they can be viewed locally in a browser.

If Docker is installed, it will be used by the documentation generator to execute the code elements while building the documentation, therefore the `--no-exe` option should be omitted (i.e. `pgbackrest/doc/doc.pl --out=html`). `--no-cache` may be used to force a full build even when no code elements have changed since the last build. `--pre` will reuse the container definitions from the prior build and saves time during development.
The containers created for documentation builds can be useful for manually testing or trying out new code or features. The following demonstrates building through just the `quickstart` section of the `user-guide` without encryption.

```
pgbackrest/doc/doc.pl --out=html --include=user-guide --require=/quickstart --var=encrypt=n --no-cache --pre
```

The resulting Docker containers can be listed with `docker ps` and a container can be entered with `docker exec doc-pg-primary bash`. Additionally, the `-u` option can be added for entering the container as a specific user (e.g. `postgres`).

## Submitting a Pull Request

Before submitting a Pull Request:

- Does it meet the [coding standards](https://github.com/pgbackrest/pgbackrest/blob/main/CODING.md)?
- Have [Unit Tests](#writing-a-unit-test) been written and [run](#running-a-unit-test) with 100% coverage?
- If your submission includes changes to the help or online documentation, have the [help](#testing-the-help) and [documentation](#testing-the-documentation) tests been run?
- Has it passed continuous integration testing? Simply renaming your branch with the suffix `-cig` and pushing it to your GitHub account will initiate GitHub Actions to run CI tests.

When submitting a Pull Request:

- Provide a short submission title.
- Write a detailed comment to describe the purpose of your submission and any issue(s) it is resolving; a link to the GitHub issue is also helpful.

After submitting a Pull Request:

- One or more reviewers will be assigned.
- Respond to any issues (conversations) in GitHub but do not resolve the conversation; the reviewer is responsible for ensuring the issue raised has been resolved and marking the conversation resolved. It is helpful to supply the commit in your reply if one was submitted to fix the issue.

Lastly, thank you for contributing to pgBackRest!
pgbackrest-release-2.37/LICENSE

The MIT License (MIT)

Portions Copyright (c) 2015-2022, The PostgreSQL Global Development Group
Portions Copyright (c) 2013-2022, David Steele

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

pgbackrest-release-2.37/README.md

# pgBackRest
Reliable PostgreSQL Backup & Restore

## Introduction

pgBackRest aims to be a reliable, easy-to-use backup and restore solution that can seamlessly scale up to the largest databases and workloads by utilizing algorithms that are optimized for database-specific requirements.

pgBackRest [v2.37](https://github.com/pgbackrest/pgbackrest/releases/tag/release/2.37) is the current stable release. Release notes are on the [Releases](http://www.pgbackrest.org/release.html) page.

Please find us on [GitHub](https://github.com/pgbackrest/pgbackrest) and give us a star if you like pgBackRest!

## Features

### Parallel Backup & Restore

Compression is usually the bottleneck during backup operations but, even with now ubiquitous multi-core servers, most database backup solutions are still single-process. pgBackRest solves the compression bottleneck with parallel processing.

Utilizing multiple cores for compression makes it possible to achieve 1TB/hr raw throughput even on a 1Gb/s link. More cores and a larger pipe lead to even higher throughput.

### Local or Remote Operation

A custom protocol allows pgBackRest to backup, restore, and archive locally or remotely via TLS/SSH with minimal configuration. An interface to query PostgreSQL is also provided via the protocol layer so that remote access to PostgreSQL is never required, which enhances security.

### Multiple Repositories

Multiple repositories allow, for example, a local repository with minimal retention for fast restores and a remote repository with a longer retention for redundancy and access across the enterprise.

### Full, Incremental, & Differential Backups

Full, differential, and incremental backups are supported. pgBackRest is not susceptible to the time resolution issues of rsync, making differential and incremental backups completely safe.

### Backup Rotation & Archive Expiration

Retention policies can be set for full and differential backups to create coverage for any timeframe.
WAL archive can be maintained for all backups or strictly for the most recent backups. In the latter case, WAL required to make older backups consistent will be maintained in the archive.

### Backup Integrity

Checksums are calculated for every file in the backup and rechecked during a restore. After a backup finishes copying files, it waits until every WAL segment required to make the backup consistent reaches the repository.

Backups in the repository are stored in the same format as a standard PostgreSQL cluster (including tablespaces). If compression is disabled and hard links are enabled it is possible to snapshot a backup in the repository and bring up a PostgreSQL cluster directly on the snapshot. This is advantageous for terabyte-scale databases that are time consuming to restore in the traditional way.

All operations utilize file and directory level fsync to ensure durability.

### Page Checksums

PostgreSQL has supported page-level checksums since 9.3. If page checksums are enabled, pgBackRest will validate the checksums for every file that is copied during a backup. All page checksums are validated during a full backup and checksums in files that have changed are validated during differential and incremental backups.

Validation failures do not stop the backup process, but warnings with details of exactly which pages have failed validation are output to the console and file log.

This feature allows page-level corruption to be detected early, before backups that contain valid copies of the data have expired.

### Backup Resume

An aborted backup can be resumed from the point where it was stopped. Files that were already copied are compared with the checksums in the manifest to ensure integrity. Since this operation can take place entirely on the backup server, it reduces load on the database server and saves time since checksum calculation is faster than compressing and retransmitting data.
### Streaming Compression & Checksums

Compression and checksum calculations are performed in stream while files are being copied to the repository, whether the repository is located locally or remotely.

If the repository is on a backup server, compression is performed on the database server and files are transmitted in a compressed format and simply stored on the backup server. When compression is disabled a lower level of compression is utilized to make efficient use of available bandwidth while keeping CPU cost to a minimum.

### Delta Restore

The manifest contains checksums for every file in the backup so that during a restore it is possible to use these checksums to speed processing enormously. On a delta restore any files not present in the backup are first removed and then checksums are taken for the remaining files. Files that match the backup are left in place and the rest of the files are restored as usual. Parallel processing can lead to a dramatic reduction in restore times.

### Parallel, Asynchronous WAL Push & Get

Dedicated commands are included for pushing WAL to the archive and getting WAL from the archive. Both commands support parallelism to accelerate processing and run asynchronously to provide the fastest possible response time to PostgreSQL.

WAL push automatically detects WAL segments that are pushed multiple times and de-duplicates when the segment is identical, otherwise an error is raised. Asynchronous WAL push allows transfer to be offloaded to another process which compresses WAL segments in parallel for maximum throughput. This can be a critical feature for databases with extremely high write volume.

Asynchronous WAL get maintains a local queue of WAL segments that are decompressed and ready for replay. This reduces the time needed to provide WAL to PostgreSQL which maximizes replay speed. Higher-latency connections and storage (such as S3) benefit the most.
The push and get commands both ensure that the database and repository match by comparing PostgreSQL versions and system identifiers. This virtually eliminates the possibility of misconfiguring the WAL archive location.

### Tablespace & Link Support

Tablespaces are fully supported and on restore tablespaces can be remapped to any location. It is also possible to remap all tablespaces to one location with a single command which is useful for development restores.

File and directory links are supported for any file or directory in the PostgreSQL cluster. When restoring it is possible to restore all links to their original locations, remap some or all links, or restore some or all links as normal files or directories within the cluster directory.

### S3, Azure, and GCS Compatible Object Store Support

pgBackRest repositories can be located in S3, Azure, and GCS compatible object stores to allow for virtually unlimited capacity and retention.

### Encryption

pgBackRest can encrypt the repository to secure backups wherever they are stored.

### Compatibility with PostgreSQL >= 8.3

pgBackRest includes support for versions down to 8.3, since older versions of PostgreSQL are still regularly utilized.

## Getting Started

pgBackRest strives to be easy to configure and operate:

- [User guides](http://www.pgbackrest.org/user-guide-index.html) for various operating systems and PostgreSQL versions.
- [Command reference](http://www.pgbackrest.org/command.html) for command-line operations.
- [Configuration reference](http://www.pgbackrest.org/configuration.html) for creating pgBackRest configurations.

Documentation for v1 can be found [here](http://www.pgbackrest.org/1). No further releases are planned for v1 because v2 is backward-compatible with v1 options and repositories.

## Contributions

Contributions to pgBackRest are always welcome!
Please see our [Contributing Guidelines](https://github.com/pgbackrest/pgbackrest/blob/main/CONTRIBUTING.md) for details on how to contribute features, improvements or issues.

## Support

pgBackRest is completely free and open source under the [MIT](https://github.com/pgbackrest/pgbackrest/blob/main/LICENSE) license. You may use it for personal or commercial purposes without any restrictions whatsoever. Bug reports are taken very seriously and will be addressed as quickly as possible.

Creating a robust disaster recovery policy with proper replication and backup strategies can be a very complex and daunting task. You may find that you need help during the architecture phase and ongoing support to ensure that your enterprise continues running smoothly.

[Crunchy Data](http://www.crunchydata.com) provides packaged versions of pgBackRest for major operating systems and expert full life-cycle commercial support for pgBackRest and all things PostgreSQL. [Crunchy Data](http://www.crunchydata.com) is committed to providing open source solutions with no vendor lock-in, ensuring that cross-compatibility with the community version of pgBackRest is always strictly maintained.

Please visit [Crunchy Data](http://www.crunchydata.com) for more information.

## Recognition

Primary recognition goes to Stephen Frost for all his valuable advice and criticism during the development of pgBackRest.

[Crunchy Data](http://www.crunchydata.com) has contributed significant time and resources to pgBackRest and continues to actively support development. [Resonate](http://www.resonate.com) also contributed to the development of pgBackRest and allowed early (but well tested) versions to be installed as their primary PostgreSQL backup solution.

[Armchair](https://thenounproject.com/search/?q=lounge+chair&i=129971) graphic by [Sandor Szabo](https://thenounproject.com/sandorsz).
pgbackrest-release-2.37/doc/000077500000000000000000000000001416457663300157745ustar00rootroot00000000000000pgbackrest-release-2.37/doc/.gitignore000066400000000000000000000000101416457663300177530ustar00rootroot00000000000000output/
pgbackrest-release-2.37/doc/NEWS.md000066400000000000000000000063141416457663300170760ustar00rootroot00000000000000**May 3, 2021**: [Crunchy Data](https://www.crunchydata.com) is pleased to announce the release of [pgBackRest](https://pgbackrest.org/) 2.33, the latest version of the reliable, easy-to-use backup and restore solution that can seamlessly scale up to the largest databases and workloads.

pgBackRest has recently introduced many exciting new features including multiple repository support, GCS support for repository storage, automatic temporary S3 credentials, repository list/get commands, and page checksum error reporting.

pgBackRest supports a robust set of features for managing your backup and recovery infrastructure, including: parallel backup/restore, full/differential/incremental backups, multiple repositories, delta restore, parallel asynchronous archiving, per-file checksums, page checksums (when enabled) validated during backup, multiple compression types, encryption, partial/failed backup resume, backup from standby, tablespace and link support, S3/Azure/GCS support, backup expiration, local/remote operation via SSH, flexible configuration, and more.

You can install pgBackRest from the [PostgreSQL Yum Repository](https://yum.postgresql.org/) or the [PostgreSQL APT Repository](https://apt.postgresql.org). Source code can be downloaded from [releases](https://github.com/pgbackrest/pgbackrest/releases).

## Major New Features

### Multiple Repository Support

Backups already provide redundancy by creating an offline copy of a PostgreSQL cluster that can be used in disaster recovery.
Multiple repositories allow multiple copies of backups and WAL archives in separate locations to increase redundancy and provide even more protection for valuable data. See [User Guide](https://pgbackrest.org/user-guide.html#multi-repo) and [Blog](https://blog.crunchydata.com/blog/introducing-pgbackrest-multiple-repository-support).

### GCS Support for Repository Storage

Repositories may now be located on Google Cloud Storage using service key authentication. See [User Guide](https://pgbackrest.org/user-guide.html#gcs-support) and [Blog](https://blog.crunchydata.com/blog/announcing-google-cloud-storage-gcs-support-for-pgbackrest).

### Automatic Temporary S3 Credentials

Temporary credentials will automatically be retrieved when a role with the required permissions is associated with an instance in AWS. See [User Guide](https://pgbackrest.org/user-guide.html#s3-support).

### Repository List/Get Commands

The `repo-ls` and `repo-get` commands allow the contents of any repository to be listed and fetched, respectively, regardless of which storage type is used for the repository. See [Command Reference](https://pgbackrest.org/command.html).

### Page Checksum Error Reporting

Page checksum errors are included when getting detailed information for a backup using the `--set` option of the `info` command. See [Command Reference](https://pgbackrest.org/command.html#command-info).

## Links

- [Website](https://pgbackrest.org)
- [User Guides](https://pgbackrest.org/user-guide-index.html)
- [Release Notes](https://pgbackrest.org/release.html)
- [Support](http://pgbackrest.org/#support)

[Crunchy Data](https://www.crunchydata.com) is proud to support the development and maintenance of [pgBackRest](https://github.com/pgbackrest/pgbackrest).
pgbackrest-release-2.37/doc/README.md000066400000000000000000000070401416457663300172540ustar00rootroot00000000000000# pgBackRest
Building Documentation

## General Builds

The pgBackRest documentation can output a variety of formats and target several platforms and PostgreSQL versions.

This will build all documentation with defaults:

```bash
./doc.pl
```

The user guide can be built for `rhel` and `debian`. This will build the HTML user guide for RHEL:

```bash
./doc.pl --out=html --include=user-guide --var=os-type=rhel
```

Documentation generation will build a cache of all executed statements and use the cache to build the documentation quickly if no executed statements have changed. This makes proofing text-only edits very fast, but sometimes it is useful to do a full build without using the cache:

```bash
./doc.pl --out=html --include=user-guide --var=os-type=rhel --no-cache
```

Each `os-type` has a default container image that will be used as a base for creating hosts but it may be useful to change the image.

```bash
./doc.pl --out=html --include=user-guide --var=os-type=debian --var=os-image=debian:9
./doc.pl --out=html --include=user-guide --var=os-type=rhel --var=os-image=centos:7
```

The following is a sample RHEL 7 configuration that can be used for building the documentation.
```bash
# Install docker
sudo yum install -y yum-utils device-mapper-persistent-data lvm2
sudo yum-config-manager --add-repo https://download.docker.com/linux/centos/docker-ce.repo
sudo yum install -y docker-ce
sudo systemctl start docker

# Install tools
sudo yum install -y git wget

# Install latex (for building PDF)
sudo yum install -y texlive texlive-titlesec texlive-sectsty texlive-framed texlive-epstopdf ghostscript

# Install Perl modules via CPAN that do not have packages
sudo yum install -y yum cpanminus
sudo yum groupinstall -y "Development Tools" "Development Libraries"
sudo cpanm install --force XML::Checker::Parser

# Add documentation test user
sudo groupadd test
sudo adduser -gtest -n testdoc
sudo usermod -aG docker testdoc
```

## Building with Packages

A user-specified package can be used when building the documentation. Since the documentation exercises most pgBackRest functionality this is a great way to smoke-test packages.

The package must be located within the pgBackRest repo and the specified path should be relative to the repository base. `test/package` is a good default path to use.

Ubuntu 16.04:

```bash
./doc.pl --out=html --include=user-guide --no-cache --var=os-type=debian --var=os-image=ubuntu:16.04 --var=package=test/package/pgbackrest_2.08-0_amd64.deb
```

RHEL 7:

```bash
./doc.pl --out=html --include=user-guide --no-cache --var=os-type=rhel --var=os-image=centos:7 --var=package=test/package/pgbackrest-2.08-1.el7.x86_64.rpm
```

RHEL 8:

```bash
./doc.pl --out=html --include=user-guide --no-cache --var=os-type=rhel --var=os-image=centos:8 --var=package=test/package/pgbackrest-2.08-1.el8.x86_64.rpm
```

Packages can be built with `test.pl` using the following configuration on top of the configuration given for building the documentation.
```bash
# Install recent git
sudo yum remove -y git
sudo yum install -y https://centos7.iuscommunity.org/ius-release.rpm
sudo yum install -y git2u-all

# Install Perl modules
sudo yum install -y perl-ExtUtils-ParseXS perl-ExtUtils-Embed perl-ExtUtils-MakeMaker perl-YAML-LibYAML

# Install dev libraries
sudo yum install -y libxml2-devel openssl-devel

# Add test user with sudo privileges
sudo adduser -gtest -n test
sudo usermod -aG docker test
sudo chmod 750 /home/test
sudo echo 'test ALL=(ALL) NOPASSWD: ALL' > /etc/sudoers.d/pgbackrest

# Add pgbackrest user required by tests
sudo adduser -gtest -n pgbackrest
```
pgbackrest-release-2.37/doc/RELEASE.md000066400000000000000000000144551416457663300174060ustar00rootroot00000000000000# Release Build Instructions

## Set location of the `pgbackrest` repo

This makes the rest of the commands in the document easier to run (change to your repo path):

```
export PGBR_REPO=~/pgbackrest
```

## Create a branch to test the release

```
git checkout -b release-ci
```

## Update the date, version, and release title

Edit the latest release in `doc/xml/release.xml`, e.g.:

```
```

to:

```
```

Edit version in `src/version.h`, e.g.:

```
#define PROJECT_VERSION "2.14dev"
```

to:

```
#define PROJECT_VERSION "2.14"
```

## Update code counts

```
${PGBR_REPO?}/test/test.pl --code-count
```

## Build release documentation

Be sure to install latex using the instructions from the Vagrantfile before running this step.
```
${PGBR_REPO?}/doc/release.pl --build
```

## Commit release branch and push to CI for testing

```
git commit -m "Release test"
git push origin release-ci
```

## Perform stress testing on release

- Build the documentation with stress testing enabled:

  ```
  ${PGBR_REPO?}/doc/doc.pl --out=html --include=user-guide --require=/stress --var=stress=y --var=stress-scale-table=100 --var=stress-scale-data=1000 --pre --no-cache
  ```

  During data load the archive-push and archive-get processes can be monitored with:

  ```
  docker exec -it doc-pg-primary tail -f /var/log/pgbackrest/demo-archive-push-async.log
  docker exec -it doc-pg-standby tail -f /var/log/pgbackrest/demo-archive-get-async.log
  ```

  During backup/restore the processes can be monitored with:

  ```
  docker exec -it doc-repository tail -f /var/log/pgbackrest/demo-backup.log
  docker exec -it doc-pg-standby tail -f /var/log/pgbackrest/demo-restore.log
  ```

  Processes can generally be monitored using 'top'. Once `top` is running, press `o` then enter `COMMAND=pgbackrest`. This will filter output to pgbackrest processes.

- Check for many log entries in the `archive-push`/`archive-get` logs to ensure async archiving was enabled:

  ```
  docker exec -it doc-pg-primary vi /var/log/pgbackrest/demo-archive-push-async.log
  docker exec -it doc-pg-standby vi /var/log/pgbackrest/demo-archive-get-async.log
  ```

- Check the backup log to ensure the correct tables/data were created and backed up. It should look something like:

  ```
  INFO: full backup size = 14.9GB, file total = 101004
  ```

- Check the restore log to ensure the correct tables/data were restored. The size and file total should match exactly.
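The size/file-total comparison between the backup and restore logs can be scripted rather than eyeballed. A minimal sketch, assuming the `INFO` line format matches the sample shown above (the exact format may vary by version):

```shell
# Extract the backup size and file total from a backup log line so the values
# can be compared against the matching restore log line. The sample line below
# uses the format from the INFO example above.
log_line='INFO: full backup size = 14.9GB, file total = 101004'
size=$(printf '%s\n' "$log_line" | sed -n 's/.*backup size = \([^,]*\),.*/\1/p')
files=$(printf '%s\n' "$log_line" | sed -n 's/.*file total = \([0-9]*\).*/\1/p')
echo "size=${size} files=${files}"
```

Running the same extraction over both logs and comparing the two pairs of values automates the "should match exactly" check.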
## Clone web documentation into `doc/site`

```
cd ${PGBR_REPO?}/doc
git clone git@github.com:pgbackrest/website.git site
```

## Deploy web documentation to `doc/site`

```
${PGBR_REPO?}/doc/release.pl --deploy
```

## Final commit of release to integration

Create release notes based on the pattern in prior git commits (this should be automated at some point), e.g.

```
v2.14: Bug Fix and Improvements

Bug Fixes:

* Fix segfault when process-max > 8 for archive-push/archive-get. (Reported by User.)

Improvements:

* Bypass database checks when stanza-delete issued with force. (Contributed by User. Suggested by User.)
* Add configure script for improved multi-platform support.

Documentation Features:

* Add user guide for Debian.
```

Commit to integration with the above message and push to CI.

## Push to main

Push release commit to main once CI testing is complete.

## Create release on github

Create release notes based on pattern in prior releases (this should be automated at some point), e.g.

```
v2.14: Bug Fix and Improvements

**Bug Fixes**:

- Fix segfault when process-max > 8 for archive-push/archive-get. (Reported by User.)

**Improvements**:

- Bypass database checks when stanza-delete issued with force. (Contributed by User. Suggested by User.)
- Add configure script for improved multi-platform support.

**Documentation Features**:

- Add user guide for Debian.
```

The first line will be the release title and the rest will be the body. The tag field should be updated with the current version so a tag is created from main. **Be sure to select the release commit explicitly rather than auto-tagging the last commit in main!**

## Push web documentation to main and deploy

```
cd ${PGBR_REPO?}/doc/site
git commit -m "v2.14 documentation."
git push origin main
```

Deploy the documentation on `pgbackrest.org`.
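The two release-note formats shown earlier (plain headers with `*` bullets for the commit message, bold headers with `-` bullets for the GitHub release) differ only mechanically, and the conversion is one of the steps noted as "should be automated at some point". A minimal `sed` sketch over sample text only, not a full implementation:

```shell
# Convert plain release-note headers ('Bug Fixes:') and '*' bullets into the
# GitHub release format ('**Bug Fixes**:' and '-' bullets). Sample input only;
# the real notes would come from the release commit message.
notes='Bug Fixes:
* Fix segfault when process-max > 8 for archive-push/archive-get.
Improvements:
* Add configure script for improved multi-platform support.'
gh_notes=$(printf '%s\n' "$notes" | sed -e 's/^\([A-Za-z ]*\):$/**\1**:/' -e 's/^\* /- /')
printf '%s\n' "$gh_notes"
```

The header rule only matches lines consisting entirely of letters and spaces followed by a colon, so bullet lines pass through untouched.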
## Notify packagers of new release

## Announce release on Twitter

## Publish a postgresql.org news item when there are major new features

Start from NEWS.md and update with the new date, version, and interesting features added since the last release. News items are automatically sent to the `pgsql-announce` mailing list once they have been approved.

## Prepare for the next release

Add new release in `doc/xml/release.xml`, e.g.:

```
```

Edit version in `src/version.h`, e.g.:

```
#define PROJECT_VERSION "2.14"
```

to:

```
#define PROJECT_VERSION "2.15dev"
```

Run deploy to generate git history (ctrl-c as soon as the file is generated):

```
${PGBR_REPO?}/doc/release.pl --build
```

Commit and push to integration:

```
git commit -m "Begin v2.15 development."
git push origin integration
```

## Update automake/config scripts

These scripts are required by `src/config` and should be updated after each release, when needed. Note that these files are updated very infrequently.

Check the latest version of `automake` and see if it is > `1.16.5`:

```
https://git.savannah.gnu.org/gitweb/?p=automake.git
```

If so, update the version above and copy `lib/install-sh` from the `automake` repo to the `pgbackrest` repo at `[repo]/src/build/install-sh`:

```
wget -O ${PGBR_REPO?}/src/build/install-sh '[URL]'
```

Get the latest versions of `config.sub` and `config.guess`. These files are not versioned so the newest version is pulled at the beginning of the release cycle to allow time to test stability.
``` wget -O ${PGBR_REPO?}/src/build/config.guess 'https://git.savannah.gnu.org/gitweb/?p=config.git;a=blob_plain;f=config.guess;hb=HEAD' wget -O ${PGBR_REPO?}/src/build/config.sub 'https://git.savannah.gnu.org/gitweb/?p=config.git;a=blob_plain;f=config.sub;hb=HEAD' ``` pgbackrest-release-2.37/doc/doc.pl000077500000000000000000000314431416457663300171060ustar00rootroot00000000000000#!/usr/bin/perl #################################################################################################################################### # doc.pl - PgBackRest Doc Builder #################################################################################################################################### #################################################################################################################################### # Perl includes #################################################################################################################################### use strict; use warnings FATAL => qw(all); use Carp qw(confess); use English '-no_match_vars'; $SIG{__DIE__} = sub { Carp::confess @_ }; use Cwd qw(abs_path); use File::Basename qw(dirname); use Getopt::Long qw(GetOptions); use Pod::Usage qw(pod2usage); use Storable; use lib dirname(abs_path($0)) . '/lib'; use lib dirname(dirname(abs_path($0))) . '/lib'; use lib dirname(dirname(abs_path($0))) . '/build/lib'; use lib dirname(dirname(abs_path($0))) . 
'/test/lib'; use pgBackRestTest::Common::ExecuteTest; use pgBackRestTest::Common::Storage; use pgBackRestTest::Common::StoragePosix; use pgBackRestDoc::Common::Doc; use pgBackRestDoc::Common::DocConfig; use pgBackRestDoc::Common::DocManifest; use pgBackRestDoc::Common::DocRender; use pgBackRestDoc::Common::Exception; use pgBackRestDoc::Common::Log; use pgBackRestDoc::Common::String; use pgBackRestDoc::Html::DocHtmlSite; use pgBackRestDoc::Latex::DocLatex; use pgBackRestDoc::Markdown::DocMarkdown; use pgBackRestDoc::ProjectInfo; #################################################################################################################################### # Usage #################################################################################################################################### =head1 NAME doc.pl - Generate pgBackRest documentation =head1 SYNOPSIS doc.pl [options] General Options: --help Display usage and exit --version Display pgBackRest version --quiet Sets log level to ERROR --log-level Log level for execution (e.g. ERROR, WARN, INFO, DEBUG) --deploy Write exe.cache into resource for persistence --no-exe Should commands be executed when building help? 
(for testing only) --no-cache Don't use execution cache --cache-only Only use the execution cache - don't attempt to generate it --pre Pre-build containers for execute elements marked pre --var Override defined variable --key-var Override defined variable and use in cache key --doc-path Document path to render (manifest.xml should be located here) --out Output types (html, pdf, markdown) --out-preserve Don't clean output directory --require Require only certain sections of the document (to speed testing) --include Include source in generation (links will reference website) --exclude Exclude source from generation (links will reference website) Variable Options: --dev Set 'dev' variable to 'y' --debug Set 'debug' variable to 'y' =cut #################################################################################################################################### # Load command line parameters and config (see usage above for details) #################################################################################################################################### my $bHelp = false; my $bVersion = false; my $bQuiet = false; my $strLogLevel = 'info'; my $bNoExe = false; my $bNoCache = false; my $bCacheOnly = false; my $rhVariableOverride = {}; my $rhKeyVariableOverride = {}; my $strDocPath; my @stryOutput; my $bOutPreserve = false; my @stryRequire; my @stryInclude; my @stryExclude; my $bDeploy = false; my $bDev = false; my $bDebug = false; my $bPre = false; GetOptions ('help' => \$bHelp, 'version' => \$bVersion, 'quiet' => \$bQuiet, 'log-level=s' => \$strLogLevel, 'out=s@' => \@stryOutput, 'out-preserve' => \$bOutPreserve, 'require=s@' => \@stryRequire, 'include=s@' => \@stryInclude, 'exclude=s@' => \@stryExclude, 'no-exe', \$bNoExe, 'deploy', \$bDeploy, 'no-cache', \$bNoCache, 'dev', \$bDev, 'debug', \$bDebug, 'pre', \$bPre, 'cache-only', \$bCacheOnly, 'key-var=s%', $rhKeyVariableOverride, 'var=s%', $rhVariableOverride, 'doc-path=s', \$strDocPath) or pod2usage(2); 
#################################################################################################################################### # Run in eval block to catch errors #################################################################################################################################### eval { # Display version and exit if requested if ($bHelp || $bVersion) { print PROJECT_NAME . ' ' . PROJECT_VERSION . " Documentation Builder\n"; if ($bHelp) { print "\n"; pod2usage(); } exit 0; } # Disable cache when no exe if ($bNoExe) { $bNoCache = true; } # Make sure options are set correctly for deploy if ($bDeploy) { my $strError = 'cannot be specified for deploy'; !$bNoExe or confess "--no-exe ${strError}"; !@stryRequire or confess "--require ${strError}"; } # one --include must be specified when --required is if (@stryRequire && @stryInclude != 1) { confess "one --include is required when --require is specified"; } # Set console log level if ($bQuiet) { $strLogLevel = 'error'; } # If --dev passed then set the dev var to 'y' if ($bDev) { $rhVariableOverride->{'dev'} = 'y'; } # If --debug passed then set the debug var to 'y' if ($bDebug) { $rhVariableOverride->{'debug'} = 'y'; } # Doesn't make sense to pass include and exclude if (@stryInclude > 0 && @stryExclude > 0) { confess "cannot specify both --include and --exclude"; } logLevelSet(undef, uc($strLogLevel), OFF); # Get the base path my $strBasePath = abs_path(dirname($0)); my $oStorageDoc = new pgBackRestTest::Common::Storage( $strBasePath, new pgBackRestTest::Common::StoragePosix({bFileSync => false, bPathSync => false})); if (!defined($strDocPath)) { $strDocPath = $strBasePath; } my $strOutputPath = "${strDocPath}/output"; # Create the out path if it does not exist if (!-e $strOutputPath) { mkdir($strOutputPath) or confess &log(ERROR, "unable to create path ${strOutputPath}"); } # Merge key variables into the variable list and ensure there are no duplicates foreach my $strKey 
(sort(keys(%{$rhKeyVariableOverride}))) { if (defined($rhVariableOverride->{$strKey})) { confess &log(ERROR, "'${strKey}' cannot be passed as --var and --key-var"); } $rhVariableOverride->{$strKey} = $rhKeyVariableOverride->{$strKey}; } # Load the manifest my $oManifest = new pgBackRestDoc::Common::DocManifest( $oStorageDoc, \@stryRequire, \@stryInclude, \@stryExclude, $rhKeyVariableOverride, $rhVariableOverride, $strDocPath, $bDeploy, $bCacheOnly, $bPre); if (!$bNoCache) { $oManifest->cacheRead(); } # If no outputs were given if (@stryOutput == 0) { @stryOutput = $oManifest->renderList(); if ($oManifest->isBackRest()) { push(@stryOutput, 'man'); } } # Build host containers if (!$bCacheOnly && !$bNoExe) { foreach my $strSource ($oManifest->sourceList()) { if ((@stryInclude == 0 || grep(/$strSource/, @stryInclude)) && !grep(/$strSource/, @stryExclude)) { &log(INFO, "source $strSource"); foreach my $oHostDefine ($oManifest->sourceGet($strSource)->{doc}->nodeList('host-define', false)) { if ($oManifest->evaluateIf($oHostDefine)) { my $strImage = $oManifest->variableReplace($oHostDefine->paramGet('image')); my $strFrom = $oManifest->variableReplace($oHostDefine->paramGet('from')); my $strDockerfile = "${strOutputPath}/doc-host.dockerfile"; &log(INFO, "Build vm '${strImage}' from '${strFrom}'"); $oStorageDoc->put( $strDockerfile, "FROM ${strFrom}\n\n" . trim($oManifest->variableReplace($oHostDefine->valueGet())) . "\n"); executeTest("docker build -f ${strDockerfile} -t ${strImage} ${strBasePath}"); } } } } } # Render output for my $strOutput (@stryOutput) { if (!($strOutput eq 'man' && $oManifest->isBackRest())) { $oManifest->renderGet($strOutput); } &log(INFO, "render ${strOutput} output"); # Clean contents of out directory if (!$bOutPreserve) { my $strOutputPath = $strOutput eq 'pdf' ? 
"${strOutputPath}/latex" : "${strOutputPath}/$strOutput"; # Clean the current out path if it exists if (-e $strOutputPath) { executeTest("rm -rf ${strOutputPath}/*"); } # Else create the html path else { mkdir($strOutputPath) or confess &log(ERROR, "unable to create path ${strOutputPath}"); } } if ($strOutput eq 'markdown') { my $oMarkdown = new pgBackRestDoc::Markdown::DocMarkdown ( $oManifest, "${strBasePath}/xml", "${strOutputPath}/markdown", !$bNoExe ); $oMarkdown->process(); } elsif ($strOutput eq 'man' && $oManifest->isBackRest()) { # Generate the command-line help my $oRender = new pgBackRestDoc::Common::DocRender('text', $oManifest, !$bNoExe); my $oDocConfig = new pgBackRestDoc::Common::DocConfig( new pgBackRestDoc::Common::Doc("${strBasePath}/../src/build/help/help.xml"), $oRender); $oStorageDoc->pathCreate( "${strBasePath}/output/man", {strMode => '0770', bIgnoreExists => true, bCreateParent => true}); $oStorageDoc->put("${strBasePath}/output/man/" . lc(PROJECT_NAME) . '.1.txt', $oDocConfig->manGet($oManifest)); } elsif ($strOutput eq 'html') { my $oHtmlSite = new pgBackRestDoc::Html::DocHtmlSite ( $oManifest, "${strBasePath}/xml", "${strOutputPath}/html", "${strBasePath}/resource/html/default.css", defined($oManifest->variableGet('project-favicon')) ? "${strBasePath}/resource/html/" . $oManifest->variableGet('project-favicon') : undef, defined($oManifest->variableGet('project-logo')) ? "${strBasePath}/resource/" . 
$oManifest->variableGet('project-logo') : undef, !$bNoExe ); $oHtmlSite->process(); } elsif ($strOutput eq 'pdf') { my $oLatex = new pgBackRestDoc::Latex::DocLatex ( $oManifest, "${strBasePath}/xml", "${strOutputPath}/latex", "${strBasePath}/resource/latex/preamble.tex", !$bNoExe ); $oLatex->process(); } } # Cache the manifest (mostly useful for testing rendering changes in the code) if (!$bNoCache && !$bCacheOnly) { $oManifest->cacheWrite(); } # Exit with success exit 0; } #################################################################################################################################### # Check for errors #################################################################################################################################### or do { # If a backrest exception then return the code exit $EVAL_ERROR->code() if (isException(\$EVAL_ERROR)); # Else output the unhandled error print $EVAL_ERROR; exit ERROR_UNHANDLED; }; # It shouldn't be possible to get here &log(ASSERT, 'execution reached invalid location in ' . __FILE__ . ', line ' . __LINE__); exit ERROR_ASSERT; pgbackrest-release-2.37/doc/example/000077500000000000000000000000001416457663300174275ustar00rootroot00000000000000pgbackrest-release-2.37/doc/example/pgsql-pgbackrest-info.sql000066400000000000000000000015131416457663300243520ustar00rootroot00000000000000-- An example of monitoring pgBackRest from within PostgreSQL -- -- Use copy to export data from the pgBackRest info command into the jsonb -- type so it can be queried directly by PostgreSQL. 
-- Create monitor schema
create schema monitor;

-- Get pgBackRest info in JSON format
create function monitor.pgbackrest_info()
    returns jsonb AS $$
declare
    data jsonb;
begin
    -- Create a temp table to hold the JSON data
    create temp table temp_pgbackrest_data (data jsonb);

    -- Copy data into the table directly from the pgBackRest info command
    copy temp_pgbackrest_data (data)
        from program 'pgbackrest --output=json info' (format text);

    select temp_pgbackrest_data.data
      into data
      from temp_pgbackrest_data;

    drop table temp_pgbackrest_data;

    return data;
end $$ language plpgsql;
pgbackrest-release-2.37/doc/example/pgsql-pgbackrest-query.sql000066400000000000000000000011401416457663300245600ustar00rootroot00000000000000
-- Get last successful backup for each stanza
--
-- Requires the monitor.pgbackrest_info function.
with stanza as
(
    select data->'name' as name,
           data->'backup'->(jsonb_array_length(data->'backup') - 1) as last_backup,
           data->'archive'->(jsonb_array_length(data->'archive') - 1) as current_archive
      from jsonb_array_elements(monitor.pgbackrest_info()) as data
)
select name,
       to_timestamp((last_backup->'timestamp'->>'stop')::numeric) as last_successful_backup,
       current_archive->>'max' as last_archived_wal
  from stanza;
pgbackrest-release-2.37/doc/lib/000077500000000000000000000000001416457663300165425ustar00rootroot00000000000000pgbackrest-release-2.37/doc/lib/pgBackRestDoc/000077500000000000000000000000001416457663300212155ustar00rootroot00000000000000pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Common/000077500000000000000000000000001416457663300224455ustar00rootroot00000000000000pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Common/Doc.pm000066400000000000000000000606301416457663300235150ustar00rootroot00000000000000####################################################################################################################################
# DOC MODULE
#################################################################################################################################### package pgBackRestDoc::Common::Doc; use strict; use warnings FATAL => qw(all); use Carp qw(confess); use English '-no_match_vars'; use Cwd qw(abs_path); use File::Basename qw(dirname); use Scalar::Util qw(blessed); use XML::Checker::Parser; use pgBackRestDoc::Common::Log; use pgBackRestDoc::Common::String; #################################################################################################################################### # CONSTRUCTOR #################################################################################################################################### sub new { my $class = shift; # Class name # Create the class hash my $self = {}; bless $self, $class; $self->{strClass} = $class; # Assign function parameters, defaults, and log debug info ( my $strOperation, $self->{strFileName}, my $strSgmlSearchPath, ) = logDebugParam ( __PACKAGE__ . '->new', \@_, {name => 'strFileName', required => false}, {name => 'strSgmlSearchPath', required => false}, ); # Load the doc from a file if one has been defined if (defined($self->{strFileName})) { my $oParser = XML::Checker::Parser->new(ErrorContext => 2, Style => 'Tree'); $oParser->set_sgml_search_path( defined($strSgmlSearchPath) ? $strSgmlSearchPath : dirname(dirname(abs_path($0))) . '/doc/xml/dtd'); my $oTree; eval { local $XML::Checker::FAIL = sub { my $iCode = shift; die XML::Checker::error_string($iCode, @_); }; $oTree = $oParser->parsefile($self->{strFileName}); return true; } # Report any error that stopped parsing or do { my $strException = $EVAL_ERROR; $strException =~ s/at \/.*?$//s; # remove module line number die "malformed xml in '$self->{strFileName}':\n" . 
trim($strException); }; # Parse and build the doc $self->{oDoc} = $self->build($self->parse(${$oTree}[0], ${$oTree}[1])); } # Else create a blank doc else { $self->{oDoc} = {name => 'doc', children => []}; } $self->{strName} = 'root'; # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'self', value => $self} ); } #################################################################################################################################### # parse # # Parse the xml doc into a more usable hash and array structure. #################################################################################################################################### sub parse { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $strName, $oyNode ) = logDebugParam ( __PACKAGE__ . '->parse', \@_, {name => 'strName', trace => true}, {name => 'oyNode', trace => true} ); my %oOut; my $iIndex = 0; my $bText = $strName eq 'text' || $strName eq 'p' || $strName eq 'title' || $strName eq 'summary' || $strName eq 'table-cell' || $strName eq 'table-column' || $strName eq 'list-item' || $strName eq 'admonition'; # Store the node name $oOut{name} = $strName; if (keys(%{$$oyNode[$iIndex]})) { $oOut{param} = $$oyNode[$iIndex]; } $iIndex++; # Look for strings and children while (defined($$oyNode[$iIndex])) { # Process string data if (ref(\$$oyNode[$iIndex]) eq 'SCALAR' && $$oyNode[$iIndex] eq '0') { $iIndex++; my $strBuffer = $$oyNode[$iIndex++]; # Strip tabs, CRs, and LFs $strBuffer =~ s/\t|\r//g; # If anything is left if (length($strBuffer) > 0) { # If text node then create array entries for strings if ($bText) { if (!defined($oOut{children})) { $oOut{children} = []; } push(@{$oOut{children}}, $strBuffer); } # Don't allow strings mixed with children elsif (length(trim($strBuffer)) > 0) { if (defined($oOut{children})) { confess "text mixed with children in node ${strName} (spaces count)"; } if 
(defined($oOut{value})) { confess "value is already defined in node ${strName} - this shouldn't happen"; } # Don't allow text mixed with $oOut{value} = $strBuffer; } } } # Process a child else { if (defined($oOut{value}) && $bText) { confess "text mixed with children in node ${strName} before child " . $$oyNode[$iIndex++] . " (spaces count)"; } if (!defined($oOut{children})) { $oOut{children} = []; } push(@{$oOut{children}}, $self->parse($$oyNode[$iIndex++], $$oyNode[$iIndex++])); } } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'oDoc', value => \%oOut, trace => true} ); } #################################################################################################################################### # build # # Restructure the doc to make walking it easier. #################################################################################################################################### sub build { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oDoc ) = logDebugParam ( __PACKAGE__ . 
'->build', \@_, {name => 'oDoc', trace => true} ); # Initialize the node object my $oOut = {name => $$oDoc{name}, children => [], value => $$oDoc{value}}; my $strError = "in node $$oDoc{name}"; # Get all params if (defined($$oDoc{param})) { for my $strParam (keys %{$$oDoc{param}}) { $$oOut{param}{$strParam} = $$oDoc{param}{$strParam}; } } if ($$oDoc{name} eq 'p' || $$oDoc{name} eq 'title' || $$oDoc{name} eq 'summary' || $$oDoc{name} eq 'table-cell' || $$oDoc{name} eq 'table-column' || $$oDoc{name} eq 'list-item' || $$oDoc{name} eq 'admonition') { $$oOut{field}{text} = $oDoc; } elsif (defined($$oDoc{children})) { for (my $iIndex = 0; $iIndex < @{$$oDoc{children}}; $iIndex++) { my $oSub = $$oDoc{children}[$iIndex]; my $strName = $$oSub{name}; if ($strName eq 'text') { $$oOut{field}{text} = $oSub; } elsif ((defined($$oSub{value}) && !defined($$oSub{param})) && $strName ne 'code-block') { $$oOut{field}{$strName} = $$oSub{value}; } elsif (!defined($$oSub{children}) && !defined($$oSub{value}) && !defined($$oSub{param})) { $$oOut{field}{$strName} = true; } else { push(@{$$oOut{children}}, $self->build($oSub)); } } } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'oDoc', value => $oOut, trace => true} ); } #################################################################################################################################### # nodeGetById # # Return a node by name - error if more than one exists #################################################################################################################################### sub nodeGetById { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $strName, $strId, $bRequired, ) = logDebugParam ( __PACKAGE__ . 
'->nodeGetById', \@_, {name => 'strName', trace => true}, {name => 'strId', required => false, trace => true}, {name => 'bRequired', default => true, trace => true} ); my $oDoc = $self->{oDoc}; my $oNode; for (my $iIndex = 0; $iIndex < @{$$oDoc{children}}; $iIndex++) { if ((defined($strName) && $$oDoc{children}[$iIndex]{name} eq $strName) && (!defined($strId) || $$oDoc{children}[$iIndex]{param}{id} eq $strId)) { if (!defined($oNode)) { $oNode = $$oDoc{children}[$iIndex]; } else { confess "found more than one child ${strName} in node $$oDoc{name}"; } } } if (!defined($oNode) && $bRequired) { confess "unable to find child ${strName}" . (defined($strId) ? " (${strId})" : '') . " in node $$oDoc{name}"; } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'oNodeDoc', value => $self->nodeBless($oNode), trace => true} ); } #################################################################################################################################### # nodeGet # # Return a node by name - error if more than one exists #################################################################################################################################### sub nodeGet { my $self = shift; return $self->nodeGetById(shift, undef, shift); } #################################################################################################################################### # nodeTest # # Test that a node exists #################################################################################################################################### sub nodeTest { my $self = shift; return defined($self->nodeGetById(shift, undef, false)); } #################################################################################################################################### # nodeAdd # # Add a node to the current doc's child list
#################################################################################################################################### sub nodeAdd { my $self = shift; my $strName = shift; my $strValue = shift; my $oParam = shift; my $oField = shift; my $oDoc = $self->{oDoc}; my $oNode = {name => $strName, value => $strValue, param => $oParam, field => $oField}; push(@{$$oDoc{children}}, $oNode); return $self->nodeBless($oNode); } #################################################################################################################################### # nodeBless # # Make a new Doc object from a node. #################################################################################################################################### sub nodeBless { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oNode ) = logDebugParam ( __PACKAGE__ . '->nodeBless', \@_, {name => 'oNode', required => false, trace => true} ); my $oDoc; if (defined($oNode)) { $oDoc = {}; bless $oDoc, $self->{strClass}; $oDoc->{strClass} = $self->{strClass}; $oDoc->{strName} = $$oNode{name}; $oDoc->{oDoc} = $oNode; } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'oDoc', value => $oDoc, trace => true} ); } #################################################################################################################################### # nodeList # # Get a list of nodes. #################################################################################################################################### sub nodeList { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $strName, $bRequired, ) = logDebugParam ( __PACKAGE__ . 
'->nodeList', \@_, {name => 'strName', required => false, trace => true}, {name => 'bRequired', default => true, trace => true}, ); my $oDoc = $self->{oDoc}; my @oyNode; if (defined($$oDoc{children})) { for (my $iIndex = 0; $iIndex < @{$$oDoc{children}}; $iIndex++) { if (!defined($strName) || $$oDoc{children}[$iIndex]{name} eq $strName) { if (ref(\$$oDoc{children}[$iIndex]) eq "SCALAR") { push(@oyNode, $$oDoc{children}[$iIndex]); } else { push(@oyNode, $self->nodeBless($$oDoc{children}[$iIndex])); } } } } if (@oyNode == 0 && $bRequired) { confess 'unable to find ' . (defined($strName) ? "children named '${strName}'" : 'any children') . " in node $$oDoc{name}"; } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'oyNode', value => \@oyNode, trace => true} ); } #################################################################################################################################### # nodeRemove # # Remove a child node. #################################################################################################################################### sub nodeRemove { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oChildRemove ) = logDebugParam ( __PACKAGE__ . 
'->nodeRemove', \@_, {name => 'oChildRemove', required => false, trace => true} ); my $bRemove = false; my $oDoc = $self->{oDoc}; # Error if there are no children if (!defined($$oDoc{children})) { confess &log(ERROR, "node has no children"); } for (my $iIndex = 0; $iIndex < @{$$oDoc{children}}; $iIndex++) { if ($$oDoc{children}[$iIndex] == $oChildRemove->{oDoc}) { splice(@{$$oDoc{children}}, $iIndex, 1); $bRemove = true; last; } } if (!$bRemove) { confess &log(ERROR, "child was not found in node, could not be removed"); } # Return from function and log return values if any return logDebugReturn($strOperation); } #################################################################################################################################### # nodeReplace # # Replace a child node with one or more child nodes. #################################################################################################################################### sub nodeReplace { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oChildRemove, $oyChildReplace, ) = logDebugParam ( __PACKAGE__ . 
'->nodeReplace', \@_, {name => 'oChildRemove', trace => true}, {name => 'oChildReplace', trace => true}, ); my $bReplace = false; my $iReplaceIdx = undef; my $iReplaceTotal = undef; my $oDoc = $self->{oDoc}; # Error if there are no children if (!defined($$oDoc{children})) { confess &log(ERROR, "node has no children"); } for (my $iIndex = 0; $iIndex < @{$$oDoc{children}}; $iIndex++) { if ($$oDoc{children}[$iIndex] == $oChildRemove->{oDoc}) { splice(@{$$oDoc{children}}, $iIndex, 1); splice(@{$$oDoc{children}}, $iIndex, 0, @{$oyChildReplace}); $iReplaceIdx = $iIndex; $iReplaceTotal = scalar(@{$oyChildReplace}); $bReplace = true; last; } } if (!$bReplace) { confess &log(ERROR, "child was not found in node, could not be replaced"); } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'iReplaceIdx', value => $iReplaceIdx, trace => true}, {name => 'iReplaceTotal', value => $iReplaceTotal, trace => true}, ); } #################################################################################################################################### # nameGet #################################################################################################################################### sub nameGet { my $self = shift; # Assign function parameters, defaults, and log debug info my $strOperation = logDebugParam(__PACKAGE__ . '->nameGet'); # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strName', value => ${$self->{oDoc}}{name}, trace => true} ); } #################################################################################################################################### # valueGet #################################################################################################################################### sub valueGet { my $self = shift; # Assign function parameters, defaults, and log debug info my $strOperation = logDebugParam(__PACKAGE__ . 
'->valueGet'); # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strValue', value => ${$self->{oDoc}}{value}, trace => true} ); } #################################################################################################################################### # valueSet #################################################################################################################################### sub valueSet { my $self = shift; my $strValue = shift; # Assign function parameters, defaults, and log debug info my $strOperation = logDebugParam(__PACKAGE__ . '->valueSet'); # Set the value ${$self->{oDoc}}{value} = $strValue; # Return from function and log return values if any return logDebugReturn($strOperation); } #################################################################################################################################### # paramGet # # Get a parameter from a node. #################################################################################################################################### sub paramGet { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $strName, $bRequired, $strDefault, $strType ) = logDebugParam ( __PACKAGE__ . 
'->paramGet', \@_, {name => 'strName', trace => true}, {name => 'bRequired', default => true, trace => true}, {name => 'strDefault', required => false, trace => true}, {name => 'strType', default => 'param', trace => true} ); my $strValue = ${$self->{oDoc}}{$strType}{$strName}; if (!defined($strValue)) { if ($bRequired) { confess "${strType} '${strName}' is required in node '$self->{strName}'"; } $strValue = $strDefault; } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strValue', value => $strValue, trace => true} ); } #################################################################################################################################### # paramTest # # Test that a parameter exists or has a certain value. #################################################################################################################################### sub paramTest { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $strName, $strExpectedValue, $strType ) = logDebugParam ( __PACKAGE__ . '->paramTest', \@_, {name => 'strName', trace => true}, {name => 'strExpectedValue', required => false, trace => true}, {name => 'strType', default => 'param', trace => true} ); my $bResult = true; my $strValue = $self->paramGet($strName, false, undef, $strType); if (!defined($strValue)) { $bResult = false; } elsif (defined($strExpectedValue) && $strValue ne $strExpectedValue) { $bResult = false; } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'bResult', value => $bResult, trace => true} ); } #################################################################################################################################### # paramSet # # Set a parameter in a node. 
#################################################################################################################################### sub paramSet { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $strName, $strValue, $strType ) = logDebugParam ( __PACKAGE__ . '->paramSet', \@_, {name => 'strName', trace => true}, {name => 'strValue', required => false, trace => true}, {name => 'strType', default => 'param', trace => true} ); ${$self->{oDoc}}{$strType}{$strName} = $strValue; # Return from function and log return values if any logDebugReturn($strOperation); } #################################################################################################################################### # fieldGet # # Get a field from a node. #################################################################################################################################### sub fieldGet { my $self = shift; return $self->paramGet(shift, shift, shift, 'field'); } #################################################################################################################################### # fieldTest # # Test if a field exists. #################################################################################################################################### sub fieldTest { my $self = shift; return $self->paramTest(shift, shift, 'field'); } #################################################################################################################################### # textGet # # Get the text field from a node. #################################################################################################################################### sub textGet { my $self = shift; return $self->nodeBless($self->paramGet('text', shift, shift, 'field')); } #################################################################################################################################### # textSet # # Set the text field in a node.
#################################################################################################################################### sub textSet { my $self = shift; my $oText = shift; if (blessed($oText) && $oText->isa('pgBackRestDoc::Common::Doc')) { $oText = $oText->{oDoc}; } elsif (ref($oText) ne 'HASH') { $oText = {name => 'text', children => [$oText]}; } return $self->paramSet('text', $oText, 'field'); } #################################################################################################################################### # fieldSet # # Set a field in a node. #################################################################################################################################### sub fieldSet { my $self = shift; $self->paramSet(shift, shift, 'field'); } 1; pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Common/DocConfig.pm #################################################################################################################################### # DOC CONFIG MODULE #################################################################################################################################### package pgBackRestDoc::Common::DocConfig; use strict; use warnings FATAL => qw(all); use Carp qw(confess); use Exporter qw(import); our @EXPORT = qw(); use File::Basename qw(dirname); use pgBackRestDoc::Common::Log; use pgBackRestDoc::Common::String; use pgBackRestDoc::Custom::DocConfigData; use pgBackRestDoc::ProjectInfo; #################################################################################################################################### # Help types #################################################################################################################################### use constant CONFIG_HELP_COMMAND => 'command'; push @EXPORT, qw(CONFIG_HELP_COMMAND); use constant CONFIG_HELP_CURRENT => 'current'; use constant CONFIG_HELP_DEFAULT
=> 'default'; use constant CONFIG_HELP_DESCRIPTION => 'description'; push @EXPORT, qw(CONFIG_HELP_DESCRIPTION); use constant CONFIG_HELP_EXAMPLE => 'example'; use constant CONFIG_HELP_INTERNAL => 'internal'; use constant CONFIG_HELP_NAME => 'name'; use constant CONFIG_HELP_NAME_ALT => 'name-alt'; push @EXPORT, qw(CONFIG_HELP_NAME_ALT); use constant CONFIG_HELP_OPTION => 'option'; push @EXPORT, qw(CONFIG_HELP_OPTION); use constant CONFIG_HELP_SECTION => 'section'; push @EXPORT, qw(CONFIG_HELP_SECTION); use constant CONFIG_HELP_SUMMARY => 'summary'; push @EXPORT, qw(CONFIG_HELP_SUMMARY); use constant CONFIG_HELP_SOURCE => 'source'; push @EXPORT, qw(CONFIG_HELP_SOURCE); use constant CONFIG_HELP_SOURCE_DEFAULT => 'default'; use constant CONFIG_HELP_SOURCE_SECTION => CONFIG_HELP_SECTION; use constant CONFIG_HELP_SOURCE_COMMAND => CONFIG_HELP_COMMAND; push @EXPORT, qw(CONFIG_HELP_SOURCE_COMMAND); #################################################################################################################################### # Config Section Types #################################################################################################################################### use constant CFGDEF_COMMAND => 'command'; use constant CFGDEF_GENERAL => 'general'; use constant CFGDEF_LOG => 'log'; use constant CFGDEF_REPOSITORY => 'repository'; #################################################################################################################################### # Option define hash #################################################################################################################################### my $rhConfigDefine = cfgDefine(); #################################################################################################################################### # Returns the option defines based on the command. 
#################################################################################################################################### sub docConfigCommandDefine { my $strOption = shift; my $strCommand = shift; if (defined($strCommand)) { return defined($rhConfigDefine->{$strOption}{&CFGDEF_COMMAND}) && defined($rhConfigDefine->{$strOption}{&CFGDEF_COMMAND}{$strCommand}) && ref($rhConfigDefine->{$strOption}{&CFGDEF_COMMAND}{$strCommand}) eq 'HASH' ? $rhConfigDefine->{$strOption}{&CFGDEF_COMMAND}{$strCommand} : undef; } return; } #################################################################################################################################### # Does the option have a default for this command? #################################################################################################################################### sub docConfigOptionDefault { my $strOption = shift; my $strCommand = shift; # Get the command define my $oCommandDefine = docConfigCommandDefine($strOption, $strCommand); # Check for default in command my $strDefault = defined($oCommandDefine) ? $$oCommandDefine{&CFGDEF_DEFAULT} : undef; # If defined return, else try to grab the global default return defined($strDefault) ? 
$strDefault : $rhConfigDefine->{$strOption}{&CFGDEF_DEFAULT}; } push @EXPORT, qw(docConfigOptionDefault); #################################################################################################################################### # Get the allowed setting range for the option if it exists #################################################################################################################################### sub docConfigOptionRange { my $strOption = shift; my $strCommand = shift; # Get the command define my $oCommandDefine = docConfigCommandDefine($strOption, $strCommand); # Check for default in command if (defined($oCommandDefine) && defined($$oCommandDefine{&CFGDEF_ALLOW_RANGE})) { return $$oCommandDefine{&CFGDEF_ALLOW_RANGE}[0], $$oCommandDefine{&CFGDEF_ALLOW_RANGE}[1]; } # If defined return, else try to grab the global default return $rhConfigDefine->{$strOption}{&CFGDEF_ALLOW_RANGE}[0], $rhConfigDefine->{$strOption}{&CFGDEF_ALLOW_RANGE}[1]; } push @EXPORT, qw(docConfigOptionRange); #################################################################################################################################### # Get the option type #################################################################################################################################### sub docConfigOptionType { my $strOption = shift; return $rhConfigDefine->{$strOption}{&CFGDEF_TYPE}; } push @EXPORT, qw(docConfigOptionType); #################################################################################################################################### # Test the option type #################################################################################################################################### sub docConfigOptionTypeTest { my $strOption = shift; my $strType = shift; return docConfigOptionType($strOption) eq $strType; } push @EXPORT, qw(docConfigOptionTypeTest); 
#################################################################################################################################### # CONSTRUCTOR #################################################################################################################################### sub new { my $class = shift; # Class name # Create the class hash my $self = {}; bless $self, $class; # Assign function parameters, defaults, and log debug info ( my $strOperation, $self->{oDoc}, $self->{oDocRender} ) = logDebugParam ( __PACKAGE__ . '->new', \@_, {name => 'oDoc'}, {name => 'oDocRender', required => false} ); $self->process(); # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'self', value => $self} ); } #################################################################################################################################### # process # # Parse the xml doc into commands and options. #################################################################################################################################### sub process { my $self = shift; # Assign function parameters, defaults, and log debug info my $strOperation = logDebugParam(__PACKAGE__ . 
'->process'); # Iterate through all commands my $oDoc = $self->{oDoc}; my $oConfigHash = {}; foreach my $strCommand (cfgDefineCommandList()) { my $oCommandDoc = $oDoc->nodeGet('operation')->nodeGet('command-list')->nodeGetById('command', $strCommand); $$oConfigHash{&CONFIG_HELP_COMMAND}{$strCommand} = {}; my $oCommand = $$oConfigHash{&CONFIG_HELP_COMMAND}{$strCommand}; $$oCommand{&CONFIG_HELP_SUMMARY} = $oCommandDoc->nodeGet('summary')->textGet(); $$oCommand{&CONFIG_HELP_DESCRIPTION} = $oCommandDoc->textGet(); $oCommand->{&CONFIG_HELP_INTERNAL} = cfgDefineCommand()->{$strCommand}{&CFGDEF_INTERNAL}; } # Iterate through all options my $oOptionDefine = cfgDefine(); foreach my $strOption (sort(keys(%{$oOptionDefine}))) { # Iterate through all commands my @stryCommandList = sort(keys(%{defined($$oOptionDefine{$strOption}{&CFGDEF_COMMAND}) ? $$oOptionDefine{$strOption}{&CFGDEF_COMMAND} : $$oConfigHash{&CONFIG_HELP_COMMAND}})); foreach my $strCommand (@stryCommandList) { if (!defined($$oConfigHash{&CONFIG_HELP_COMMAND}{$strCommand})) { next; } # Skip the option if it is not valid for this command and the default role. Only options valid for the default role are # shown in help because that is the only role available to a user.
if (!defined($oOptionDefine->{$strOption}{&CFGDEF_COMMAND}{$strCommand}{&CFGDEF_COMMAND_ROLE}{&CFGCMD_ROLE_MAIN})) { next; } my $oCommandDoc = $oDoc->nodeGet('operation')->nodeGet('command-list')->nodeGetById('command', $strCommand); # First check if the option is documented in the command my $oOptionDoc; my $strOptionSource; my $oCommandOptionList = $oCommandDoc->nodeGet('option-list', false); if (defined($oCommandOptionList)) { $oOptionDoc = $oCommandOptionList->nodeGetById('option', $strOption, false); $strOptionSource = CONFIG_HELP_SOURCE_COMMAND if (defined($oOptionDoc)); } # If the option wasn't found keep looking my $strSection; if (!defined($oOptionDoc)) { # Next see if it's documented in the section if (defined($$oOptionDefine{$strOption}{&CFGDEF_SECTION})) { # &log(INFO, " trying section ${strSection}"); foreach my $oSectionNode ($oDoc->nodeGet('config')->nodeGet('config-section-list')->nodeList()) { my $oOptionDocCheck = $oSectionNode->nodeGetById('config-key-list') ->nodeGetById('config-key', $strOption, false); if ($oOptionDocCheck) { if (defined($oOptionDoc)) { confess 'option exists in more than one section'; } $oOptionDoc = $oOptionDocCheck; $strOptionSource = CONFIG_HELP_SOURCE_SECTION; $strSection = $oSectionNode->paramGet('id'); } } } # If no section is defined then look in the default command option list else { $oOptionDoc = $oDoc->nodeGet('operation')->nodeGet('operation-general')->nodeGet('option-list') ->nodeGetById('option', $strOption, false); $strOptionSource = CONFIG_HELP_SOURCE_DEFAULT if (defined($oOptionDoc)); # If a section is specified then use it, otherwise the option should be general since it is not for a specific # command if (defined($oOptionDoc)) { $strSection = $oOptionDoc->paramGet('section', false); if (!defined($strSection)) { $strSection = "general"; } } } } # If the option wasn't found then error if (!defined($oOptionDoc)) { confess &log(ERROR, "unable to find option '${strOption}' for command '${strCommand}'") } # if the 
option is documented in the command then it should be accessible from the command line only. if (!defined($strSection)) { if (defined($$oOptionDefine{$strOption}{&CFGDEF_SECTION})) { &log(ERROR, "option ${strOption} defined in command ${strCommand} must not have " . CFGDEF_SECTION . " defined"); } } # Store the option in the command $$oConfigHash{&CONFIG_HELP_COMMAND}{$strCommand}{&CONFIG_HELP_OPTION}{$strOption}{&CONFIG_HELP_SOURCE} = $strOptionSource; my $oCommandOption = $$oConfigHash{&CONFIG_HELP_COMMAND}{$strCommand}{&CONFIG_HELP_OPTION}{$strOption}; $$oCommandOption{&CONFIG_HELP_SUMMARY} = $oOptionDoc->nodeGet('summary')->textGet(); $$oCommandOption{&CONFIG_HELP_DESCRIPTION} = $oOptionDoc->textGet(); $$oCommandOption{&CONFIG_HELP_EXAMPLE} = $oOptionDoc->fieldGet('example'); $oCommandOption->{&CONFIG_HELP_INTERNAL} = cfgDefineCommand()->{$strCommand}{&CFGDEF_INTERNAL} ? true : $oOptionDefine->{$strOption}{&CFGDEF_INTERNAL}; $$oCommandOption{&CONFIG_HELP_NAME} = $oOptionDoc->paramGet('name'); # Generate a list of alternate names if (defined($rhConfigDefine->{$strOption}{&CFGDEF_DEPRECATE})) { my $rhNameAlt = {}; foreach my $strNameAlt (sort(keys(%{$rhConfigDefine->{$strOption}{&CFGDEF_DEPRECATE}}))) { $strNameAlt =~ s/\?//g; if ($strNameAlt ne $strOption) { $rhNameAlt->{$strNameAlt} = true; } } my @stryNameAlt = sort(keys(%{$rhNameAlt})); if (@stryNameAlt > 0) { $oCommandOption->{&CONFIG_HELP_NAME_ALT} = \@stryNameAlt; } } # If the option did not come from the command also store in global option list. This prevents duplication of commonly # used options. 
if ($strOptionSource ne CONFIG_HELP_SOURCE_COMMAND) { $$oConfigHash{&CONFIG_HELP_OPTION}{$strOption}{&CONFIG_HELP_SUMMARY} = $$oCommandOption{&CONFIG_HELP_SUMMARY}; my $oOption = $$oConfigHash{&CONFIG_HELP_OPTION}{$strOption}; if (defined($strSection)) { $$oOption{&CONFIG_HELP_SECTION} = $strSection; } $$oOption{&CONFIG_HELP_NAME} = $oOptionDoc->paramGet('name'); $oOption->{&CONFIG_HELP_NAME_ALT} = $oCommandOption->{&CONFIG_HELP_NAME_ALT}; $$oOption{&CONFIG_HELP_DESCRIPTION} = $$oCommandOption{&CONFIG_HELP_DESCRIPTION}; $$oOption{&CONFIG_HELP_EXAMPLE} = $oOptionDoc->fieldGet('example'); $oOption->{&CONFIG_HELP_INTERNAL} = $oOptionDefine->{$strOption}{&CFGDEF_INTERNAL}; } } } # Store the config hash $self->{oConfigHash} = $oConfigHash; # Return from function and log return values if any logDebugReturn($strOperation); } #################################################################################################################################### # manGet # # Generate the man page. #################################################################################################################################### sub manGet { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oManifest ) = logDebugParam ( __PACKAGE__ . '->manGet', \@_, {name => 'oManifest'} ); # Get index.xml to pull various text from my $oIndexDoc = ${$oManifest->sourceGet('index')}{doc}; # Write the header my $strManPage = "NAME\n" . ' ' . PROJECT_NAME . ' - ' . $oManifest->variableReplace($oIndexDoc->paramGet('subtitle')) . "\n\n" . "SYNOPSIS\n" . ' ' . PROJECT_EXE . ' [options] [command]'; # Output the description (first two paragraphs of index.xml introduction) my $iParaTotal = 0; $strManPage .= "\n\n" . "DESCRIPTION"; foreach my $oPara ($oIndexDoc->nodeGetById('section', 'introduction')->nodeList('p')) { $strManPage .= ($iParaTotal == 0 ? "\n" : "\n\n") . ' ' . 
manGetFormatText($oManifest->variableReplace($self->{oDocRender}->processText($oPara->textGet())), 80, 2); last; } # Build command and config hashes my $hConfigDefine = cfgDefine(); my $hConfig = $self->{oConfigHash}; my $hCommandList = {}; my $iCommandMaxLen = 0; my $hOptionList = {}; my $iOptionMaxLen = 0; foreach my $strCommand (sort(keys(%{$$hConfig{&CONFIG_HELP_COMMAND}}))) { # Skip internal commands next if $hConfig->{&CONFIG_HELP_COMMAND}{$strCommand}{&CONFIG_HELP_INTERNAL}; my $hCommand = $$hConfig{&CONFIG_HELP_COMMAND}{$strCommand}; $iCommandMaxLen = length($strCommand) > $iCommandMaxLen ? length($strCommand) : $iCommandMaxLen; $$hCommandList{$strCommand}{summary} = $$hCommand{&CONFIG_HELP_SUMMARY}; if (defined($$hCommand{&CONFIG_HELP_OPTION})) { foreach my $strOption (sort(keys(%{$$hCommand{&CONFIG_HELP_OPTION}}))) { my $hOption = $$hCommand{&CONFIG_HELP_OPTION}{$strOption}; if ($$hOption{&CONFIG_HELP_SOURCE} eq CONFIG_HELP_SOURCE_COMMAND) { $iOptionMaxLen = length($strOption) > $iOptionMaxLen ? length($strOption) : $iOptionMaxLen; $$hOptionList{$strCommand}{$strOption}{&CONFIG_HELP_SUMMARY} = $$hOption{&CONFIG_HELP_SUMMARY}; } } } } foreach my $strOption (sort(keys(%{$$hConfig{&CONFIG_HELP_OPTION}}))) { # Skip internal options next if $hConfig->{&CONFIG_HELP_OPTION}{$strOption}{&CONFIG_HELP_INTERNAL}; my $hOption = $$hConfig{&CONFIG_HELP_OPTION}{$strOption}; $iOptionMaxLen = length($strOption) > $iOptionMaxLen ? length($strOption) : $iOptionMaxLen; my $strSection = defined($$hOption{&CONFIG_HELP_SECTION}) ? $$hOption{&CONFIG_HELP_SECTION} : CFGDEF_GENERAL; $$hOptionList{$strSection}{$strOption}{&CONFIG_HELP_SUMMARY} = $$hOption{&CONFIG_HELP_SUMMARY}; } # Output Commands $strManPage .= "\n\n" . 
'COMMANDS'; foreach my $strCommand (sort(keys(%{$hCommandList}))) { # Construct the summary my $strSummary = $oManifest->variableReplace($self->{oDocRender}->processText($$hCommandList{$strCommand}{summary})); # $strSummary = lcfirst(substr($strSummary, 0, length($strSummary) - 1)); # Output the summary $strManPage .= "\n " . "${strCommand}" . (' ' x ($iCommandMaxLen - length($strCommand))) . ' ' . manGetFormatText($strSummary, 80, $iCommandMaxLen + 4); } # Output options my $bFirst = true; $strManPage .= "\n\n" . 'OPTIONS'; foreach my $strSection (sort(keys(%{$hOptionList}))) { $strManPage .= ($bFirst ?'' : "\n") . "\n " . ucfirst($strSection) . ' Options:'; foreach my $strOption (sort(keys(%{$$hOptionList{$strSection}}))) { my $hOption = $$hOptionList{$strSection}{$strOption}; # Construct the default my $strCommand = grep(/$strSection/i, cfgDefineCommandList()) ? $strSection : undef; my $strDefault = docConfigOptionDefault($strOption, $strCommand); if (defined($strDefault)) { if ($strOption eq CFGOPT_REPO_HOST_CMD || $strOption eq CFGOPT_PG_HOST_CMD) { $strDefault = PROJECT_EXE; } elsif ($$hConfigDefine{$strOption}{&CFGDEF_TYPE} eq &CFGDEF_TYPE_BOOLEAN) { $strDefault = $strDefault ? 'y' : 'n'; } } # # use Data::Dumper; confess Dumper($$hOption{&CONFIG_HELP_SUMMARY}); # Construct the summary my $strSummary = $oManifest->variableReplace($self->{oDocRender}->processText($$hOption{&CONFIG_HELP_SUMMARY})); $strSummary = $strSummary . (defined($strDefault) ? " [default=${strDefault}]" : ''); # Output the summary $strManPage .= "\n " . "--${strOption}" . (' ' x ($iOptionMaxLen - length($strOption))) . ' ' . manGetFormatText($strSummary, 80, $iOptionMaxLen + 8); } $bFirst = false; } # Write files, examples, and references $strManPage .= "\n\n" . "FILES\n" . "\n" . ' ' . docConfigOptionDefault(CFGOPT_CONFIG) . "\n" . ' ' . docConfigOptionDefault(CFGOPT_REPO_PATH) . "\n" . ' ' . docConfigOptionDefault(CFGOPT_LOG_PATH) . "\n" . ' ' . 
docConfigOptionDefault(CFGOPT_SPOOL_PATH) . "\n" . ' ' . docConfigOptionDefault(CFGOPT_LOCK_PATH) . "\n" . "\n" . "EXAMPLES\n" . "\n" . " * Create a backup of the PostgreSQL `main` cluster:\n" . "\n" . ' $ ' . PROJECT_EXE . ' --' . CFGOPT_STANZA . "=main backup\n" . "\n" . ' The `main` cluster should be configured in `' . docConfigOptionDefault(CFGOPT_CONFIG) . "`\n" . "\n" . " * Show all available backups:\n" . "\n" . ' $ ' . PROJECT_EXE . ' ' . CFGCMD_INFO . "\n" . "\n" . " * Show all available backups for a specific cluster:\n" . "\n" . ' $ ' . PROJECT_EXE . ' --' . CFGOPT_STANZA . '=main ' . CFGCMD_INFO . "\n" . "\n" . " * Show backup specific options:\n" . "\n" . ' $ ' . PROJECT_EXE . ' ' . CFGCMD_HELP . ' ' . CFGCMD_BACKUP . "\n" . "\n" . "SEE ALSO\n" . "\n" . ' /usr/share/doc/' . PROJECT_EXE . "-doc/html/index.html\n" . ' ' . $oManifest->variableReplace('{[backrest-url-base]}') . "\n"; return $strManPage; } # Helper function for manGet() used to format text by indenting and splitting sub manGetFormatText { my $strLine = shift; my $iLength = shift; my $iIndentRest = shift; my $strPart; my $strResult; my $bFirst = true; do { my $iIndent = $bFirst ? 0 : $iIndentRest; ($strPart, $strLine) = stringSplit($strLine, ' ', $iLength - $iIndentRest); $strResult .= ($bFirst ? '' : "\n") . (' ' x $iIndent) . trim($strPart); $bFirst = false; } while (defined($strLine)); return $strResult; } #################################################################################################################################### # helpConfigDocGet # # Get the xml for configuration help. #################################################################################################################################### sub helpConfigDocGet { my $self = shift; # Assign function parameters, defaults, and log debug info my $strOperation = logDebugParam(__PACKAGE__ . 
'->helpConfigDocGet'); # Build a hash of the sections my $oConfigHash = $self->{oConfigHash}; my $oConfigDoc = $self->{oDoc}->nodeGet('config'); my $oSectionHash = {}; foreach my $strOption (sort(keys(%{$$oConfigHash{&CONFIG_HELP_OPTION}}))) { my $oOption = $$oConfigHash{&CONFIG_HELP_OPTION}{$strOption}; if (defined($$oOption{&CONFIG_HELP_SECTION})) { $$oSectionHash{$$oOption{&CONFIG_HELP_SECTION}}{$strOption} = true; } } my $oDoc = new pgBackRestDoc::Common::Doc(); $oDoc->paramSet('title', $oConfigDoc->paramGet('title')); # set the description for use as a meta tag $oDoc->fieldSet('description', $oConfigDoc->fieldGet('description')); # Output the introduction my $oIntroSectionDoc = $oDoc->nodeAdd('section', undef, {id => 'introduction'}); $oIntroSectionDoc->nodeAdd('title')->textSet('Introduction'); $oIntroSectionDoc->textSet($oConfigDoc->textGet()); foreach my $strSection (sort(keys(%{$oSectionHash}))) { my $oSectionElement = $oDoc->nodeAdd('section', undef, {id => "section-${strSection}"}); my $oSectionDoc = $oConfigDoc->nodeGet('config-section-list')->nodeGetById('config-section', $strSection); # Set the summary text for the section $oSectionElement->textSet($oSectionDoc->textGet()); $oSectionElement-> nodeAdd('title')->textSet( {name => 'text', children=> [$oSectionDoc->paramGet('name') . ' Options (', {name => 'id', value => $strSection}, ')']}); foreach my $strOption (sort(keys(%{$$oSectionHash{$strSection}}))) { # Skip internal options next if $oConfigHash->{&CONFIG_HELP_OPTION}{$strOption}{&CONFIG_HELP_INTERNAL}; $self->helpOptionGet(undef, $strOption, $oSectionElement, $$oConfigHash{&CONFIG_HELP_OPTION}{$strOption}); } } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'oDoc', value => $oDoc} ); } #################################################################################################################################### # helpCommandDocGet # # Get the xml for command help. 
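#
# For orientation, the section tree assembled by this function nests roughly as
# follows. This sketch is inferred from the nodeAdd() calls below and in
# helpOptionGet() and is illustrative only, not an authoritative schema:
#
#   section id="introduction"
#   section id="command-<command>"          (one per non-internal command)
#       section id="category-<category>"    (command, general, log, repository, stanza)
#           section id="option-<option>"    (summary, description, code-block)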
#################################################################################################################################### sub helpCommandDocGet { my $self = shift; # Assign function parameters, defaults, and log debug info my $strOperation = logDebugParam(__PACKAGE__ . '->helpCommandDocGet'); # Working variables my $oConfigHash = $self->{oConfigHash}; my $oOperationDoc = $self->{oDoc}->nodeGet('operation'); my $oOptionDefine = cfgDefine(); my $oDoc = new pgBackRestDoc::Common::Doc(); $oDoc->paramSet('title', $oOperationDoc->paramGet('title')); # set the description for use as a meta tag $oDoc->fieldSet('description', $oOperationDoc->fieldGet('description')); # Output the introduction my $oIntroSectionDoc = $oDoc->nodeAdd('section', undef, {id => 'introduction'}); $oIntroSectionDoc->nodeAdd('title')->textSet('Introduction'); $oIntroSectionDoc->textSet($oOperationDoc->textGet()); foreach my $strCommand (sort(keys(%{$$oConfigHash{&CONFIG_HELP_COMMAND}}))) { # Skip internal commands next if $oConfigHash->{&CONFIG_HELP_COMMAND}{$strCommand}{&CONFIG_HELP_INTERNAL}; my $oCommandHash = $$oConfigHash{&CONFIG_HELP_COMMAND}{$strCommand}; my $oSectionElement = $oDoc->nodeAdd('section', undef, {id => "command-${strCommand}"}); my $oCommandDoc = $oOperationDoc->nodeGet('command-list')->nodeGetById('command', $strCommand); $oSectionElement-> nodeAdd('title')->textSet( {name => 'text', children=> [$oCommandDoc->paramGet('name') . 
' Command (', {name => 'id', value => $strCommand}, ')']}); $oSectionElement->textSet($$oCommandHash{&CONFIG_HELP_DESCRIPTION}); # use Data::doc; # confess Dumper($oDoc->{oDoc}); if (defined($$oCommandHash{&CONFIG_HELP_OPTION})) { my $oCategory = {}; foreach my $strOption (sort(keys(%{$$oCommandHash{&CONFIG_HELP_OPTION}}))) { # Skip internal options next if $rhConfigDefine->{$strOption}{&CFGDEF_INTERNAL}; # Skip secure options that can't be defined on the command line next if ($rhConfigDefine->{$strOption}{&CFGDEF_SECURE}); my ($oOption, $strCategory) = helpCommandDocGetOptionFind($oConfigHash, $oOptionDefine, $strCommand, $strOption); $$oCategory{$strCategory}{$strOption} = $oOption; } # Iterate sections foreach my $strCategory (sort(keys(%{$oCategory}))) { my $oOptionListElement = $oSectionElement->nodeAdd( 'section', undef, {id => "category-${strCategory}", toc => 'n'}); $oOptionListElement-> nodeAdd('title')->textSet(ucfirst($strCategory) . ' Options'); # Iterate options foreach my $strOption (sort(keys(%{$$oCategory{$strCategory}}))) { $self->helpOptionGet($strCommand, $strOption, $oOptionListElement, $$oCommandHash{&CONFIG_HELP_OPTION}{$strOption}); } } } } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'oDoc', value => $oDoc} ); } # Helper function for helpCommandDocGet() to find options. The option may be stored with the command or in the option list depending # on whether it's generic or command-specific sub helpCommandDocGetOptionFind { my $oConfigHelpData = shift; my $oOptionDefine = shift; my $strCommand = shift; my $strOption = shift; # Get section from the option my $strSection = $oConfigHelpData->{&CONFIG_HELP_OPTION}{$strOption}{&CONFIG_HELP_SECTION}; # Get option from the command to start my $oOption = $$oConfigHelpData{&CONFIG_HELP_COMMAND}{$strCommand}{&CONFIG_HELP_OPTION}{$strOption}; # If the option has a section (i.e. 
not command-line only) then it comes from the standard option reference if ($$oOption{&CONFIG_HELP_SOURCE} eq CONFIG_HELP_SOURCE_SECTION) { $oOption = $$oConfigHelpData{&CONFIG_HELP_OPTION}{$strOption}; } # Reduce the sections that are shown in the command help. This is the same logic as help.c. if (!defined($strSection) || ($strSection ne 'general' && $strSection ne 'log' && $strSection ne 'repository' && $strSection ne 'stanza')) { $strSection = 'command'; } return $oOption, $strSection; } #################################################################################################################################### # helpOptionGet # # Get the xml for an option. #################################################################################################################################### sub helpOptionGet { my $self = shift; my $strCommand = shift; my $strOption = shift; my $oParentElement = shift; my $oOptionHash = shift; # Create the option section my $oOptionElement = $oParentElement->nodeAdd( 'section', undef, {id => "option-${strOption}", toc => defined($strCommand) ? 'n' : 'y'}); # Set the option section title $oOptionElement-> nodeAdd('title')->textSet( {name => 'text', children=> [$$oOptionHash{&CONFIG_HELP_NAME} . ' Option (', {name => 'id', value => "--${strOption}"}, ')']}); # Add the option summary and description $oOptionElement-> nodeAdd('p')->textSet($$oOptionHash{&CONFIG_HELP_SUMMARY}); $oOptionElement-> nodeAdd('p')->textSet($$oOptionHash{&CONFIG_HELP_DESCRIPTION}); # Get the default value (or required=n if there is no default) my $strCodeBlock; if (defined(docConfigOptionDefault($strOption, $strCommand))) { my $strDefault; if ($strOption eq CFGOPT_REPO_HOST_CMD || $strOption eq CFGOPT_PG_HOST_CMD) { $strDefault = '[INSTALL-PATH]/' . PROJECT_EXE; } else { if (docConfigOptionTypeTest($strOption, CFGDEF_TYPE_BOOLEAN)) { $strDefault = docConfigOptionDefault($strOption, $strCommand) ? 
'y' : 'n'; } else { $strDefault = docConfigOptionDefault($strOption, $strCommand); } } $strCodeBlock = "default: ${strDefault}"; } # This won't work correctly until there is some notion of dependency # elsif (optionRequired($strOption, $strCommand)) # { # $strCodeBlock = 'required: y'; # } # Get the allowed range if it exists my ($strRangeMin, $strRangeMax) = docConfigOptionRange($strOption, $strCommand); if (defined($strRangeMin)) { $strCodeBlock .= (defined($strCodeBlock) ? "\n" : '') . "allowed: ${strRangeMin}-${strRangeMax}"; } # Get the example my $strExample; my $strOptionPrefix = $rhConfigDefine->{$strOption}{&CFGDEF_GROUP}; my $strOptionIndex = defined($strOptionPrefix) ? "${strOptionPrefix}1-" . substr($strOption, length($strOptionPrefix) + 1) : $strOption; if (defined($strCommand)) { if (docConfigOptionTypeTest($strOption, CFGDEF_TYPE_BOOLEAN)) { if ($$oOptionHash{&CONFIG_HELP_EXAMPLE} ne 'n' && $$oOptionHash{&CONFIG_HELP_EXAMPLE} ne 'y') { confess &log(ERROR, "option ${strOption} example should be boolean but value is: " . $$oOptionHash{&CONFIG_HELP_EXAMPLE}); } $strExample = '--' . ($$oOptionHash{&CONFIG_HELP_EXAMPLE} eq 'n' ? 'no-' : '') . $strOptionIndex; } else { $strExample = "--${strOptionIndex}=" . $$oOptionHash{&CONFIG_HELP_EXAMPLE}; } } else { $strExample = "${strOptionIndex}=" . $$oOptionHash{&CONFIG_HELP_EXAMPLE}; } $strCodeBlock .= (defined($strCodeBlock) ? "\n" : '') . "example: ${strExample}"; $oOptionElement-> nodeAdd('code-block')->valueSet($strCodeBlock); # Output deprecated names if (defined($oOptionHash->{&CONFIG_HELP_NAME_ALT})) { my $strCaption = 'Deprecated Name' . (@{$oOptionHash->{&CONFIG_HELP_NAME_ALT}} > 1 ? 's' : ''); $oOptionElement-> nodeAdd('p')->textSet("${strCaption}: " . 
join(', ', @{$oOptionHash->{&CONFIG_HELP_NAME_ALT}})); } } 1; pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Common/DocExecute.pm000066400000000000000000001132641416457663300250420ustar00rootroot00000000000000#################################################################################################################################### # DOC EXECUTE MODULE #################################################################################################################################### package pgBackRestDoc::Common::DocExecute; use parent 'pgBackRestDoc::Common::DocRender'; use strict; use warnings FATAL => qw(all); use Carp qw(confess); use English '-no_match_vars'; use Cwd qw(abs_path); use Exporter qw(import); our @EXPORT = qw(); use File::Basename qw(dirname); use Storable qw(dclone); use pgBackRestTest::Common::ExecuteTest; use pgBackRestTest::Common::HostTest; use pgBackRestTest::Common::HostGroupTest; use pgBackRestDoc::Common::DocManifest; use pgBackRestDoc::Common::Exception; use pgBackRestDoc::Common::Ini; use pgBackRestDoc::Common::Log; use pgBackRestDoc::Common::String; use pgBackRestDoc::Custom::DocConfigData; use pgBackRestDoc::ProjectInfo; #################################################################################################################################### # User that's building the docs #################################################################################################################################### use constant DOC_USER => getpwuid($UID) eq 'root' ? 'ubuntu' : getpwuid($UID) . 
''; #################################################################################################################################### # CONSTRUCTOR #################################################################################################################################### sub new { my $class = shift; # Class name # Assign function parameters, defaults, and log debug info my ( $strOperation, $strType, $oManifest, $strRenderOutKey, $bExe ) = logDebugParam ( __PACKAGE__ . '->new', \@_, {name => 'strType'}, {name => 'oManifest'}, {name => 'strRenderOutKey'}, {name => 'bExe'} ); # Create the class hash my $self = $class->SUPER::new($strType, $oManifest, $bExe, $strRenderOutKey); bless $self, $class; if (defined($self->{oSource}{hyCache})) { $self->{bCache} = true; $self->{iCacheIdx} = 0; } else { $self->{bCache} = false; } $self->{bExe} = $bExe; $self->{iCmdLineLen} = $self->{oDoc}->paramGet('cmd-line-len', false, 80); # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'self', value => $self} ); } #################################################################################################################################### # executeKey # # Get a unique key for the execution step to determine if the cache is valid. #################################################################################################################################### sub executeKey { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $strHostName, $oCommand, ) = logDebugParam ( __PACKAGE__ . '->executeKey', \@_, {name => 'strHostName', trace => true}, {name => 'oCommand', trace => true}, ); # Add user to command my $bUserForce = $oCommand->paramTest('user-force', 'y') ? 
true : false; my $strCommand = $self->{oManifest}->variableReplace(trim($oCommand->fieldGet('exe-cmd'))); my $strUser = $self->{oManifest}->variableReplace($oCommand->paramGet('user', false, DOC_USER)); $strCommand = ($strUser eq DOC_USER || $bUserForce ? '' : ('sudo ' . ($strUser eq 'root' ? '' : "-u $strUser "))) . $strCommand; # Format and split command $strCommand =~ s/[ ]*\n[ ]*/ \\\n /smg; $strCommand =~ s/ \\\@ \\//smg; my @stryCommand = split("\n", $strCommand); my $hCacheKey = { host => $strHostName, cmd => \@stryCommand, output => JSON::PP::false, }; $$hCacheKey{'run-as-user'} = $bUserForce ? $strUser : undef; if (defined($oCommand->fieldGet('exe-cmd-extra', false))) { $$hCacheKey{'cmd-extra'} = $self->{oManifest}->variableReplace($oCommand->fieldGet('exe-cmd-extra')); } if (defined($oCommand->paramGet('err-expect', false))) { $$hCacheKey{'err-expect'} = $oCommand->paramGet('err-expect'); } if ($oCommand->paramTest('output', 'y') || $oCommand->paramTest('show', 'y') || $oCommand->paramTest('variable-key')) { $$hCacheKey{'output'} = JSON::PP::true; } $$hCacheKey{'load-env'} = $oCommand->paramTest('load-env', 'n') ? JSON::PP::false : JSON::PP::true; $$hCacheKey{'bash-wrap'} = $oCommand->paramTest('bash-wrap', 'n') ? JSON::PP::false : JSON::PP::true; if (defined($oCommand->fieldGet('exe-highlight', false))) { $$hCacheKey{'output'} = JSON::PP::true; $$hCacheKey{highlight}{'filter'} = $oCommand->paramTest('filter', 'n') ? 
JSON::PP::false : JSON::PP::true; $$hCacheKey{highlight}{'filter-context'} = $oCommand->paramGet('filter-context', false, 2); my @stryHighlight; $stryHighlight[0] = $self->{oManifest}->variableReplace($oCommand->fieldGet('exe-highlight')); $$hCacheKey{highlight}{list} = \@stryHighlight; } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'hExecuteKey', value => $hCacheKey, trace => true} ); } #################################################################################################################################### # execute #################################################################################################################################### sub execute { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oSection, $strHostName, $oCommand, $iIndent, $bCache, $bShow, ) = logDebugParam ( __PACKAGE__ . '->execute', \@_, {name => 'oSection'}, {name => 'strHostName'}, {name => 'oCommand'}, {name => 'iIndent', optional => true, default => 1}, {name => 'bCache', optional => true, default => true}, {name => 'bShow', optional => true, default => true}, ); # Working variables my $hCacheKey = $self->executeKey($strHostName, $oCommand); my $strCommand = join("\n", @{$$hCacheKey{cmd}}); my $strOutput; if ($bShow && $self->{bExe} && $self->isRequired($oSection)) { # Make sure that no lines are greater than 80 chars foreach my $strLine (split("\n", $strCommand)) { if (length(trim($strLine)) > $self->{iCmdLineLen}) { confess &log(ERROR, "command has a line > $self->{iCmdLineLen} characters:\n${strCommand}\noffending line: ${strLine}"); } } } &log(DEBUG, (' ' x $iIndent) . 
"execute: $strCommand"); if ($self->{oManifest}->variableReplace($oCommand->paramGet('skip', false, 'n')) ne 'y') { if ($self->{bExe} && $self->isRequired($oSection)) { my ($bCacheHit, $strCacheType, $hCacheKey, $hCacheValue) = $self->cachePop('exe', $hCacheKey); if ($bCacheHit) { $strOutput = defined($$hCacheValue{output}) ? join("\n", @{$$hCacheValue{output}}) : undef; } else { # Check that the host is valid my $oHost = $self->{host}{$strHostName}; if (!defined($oHost)) { confess &log(ERROR, "cannot execute on host ${strHostName} because the host does not exist"); } my $oExec = $oHost->execute( $strCommand . (defined($$hCacheKey{'cmd-extra'}) ? ' ' . $$hCacheKey{'cmd-extra'} : ''), {iExpectedExitStatus => $$hCacheKey{'err-expect'}, bSuppressError => $oCommand->paramTest('err-suppress', 'y'), iRetrySeconds => $oCommand->paramGet('retry', false)}, $hCacheKey->{'run-as-user'}, {bLoadEnv => $hCacheKey->{'load-env'}, bBashWrap => $hCacheKey->{'bash-wrap'}}); $oExec->begin(); $oExec->end(); if (defined($oExec->{strOutLog}) && $oExec->{strOutLog} ne '') { $strOutput = $oExec->{strOutLog}; # Trim off extra linefeeds before and after $strOutput =~ s/^\n+|\n$//g; } if (defined($$hCacheKey{'err-expect'}) && defined($oExec->{strErrorLog}) && $oExec->{strErrorLog} ne '') { $strOutput .= $oExec->{strErrorLog}; } if ($$hCacheKey{output} && defined($$hCacheKey{highlight}) && $$hCacheKey{highlight}{filter} && defined($strOutput)) { my $strHighLight = @{$$hCacheKey{highlight}{list}}[0]; if (!defined($strHighLight)) { confess &log(ERROR, 'filter requires highlight definition: ' . 
$strCommand); } my $iFilterContext = $$hCacheKey{highlight}{'filter-context'}; my @stryOutput = split("\n", $strOutput); undef($strOutput); # my $iFiltered = 0; my $iLastOutput = -1; for (my $iIndex = 0; $iIndex < @stryOutput; $iIndex++) { if ($stryOutput[$iIndex] =~ /$strHighLight/) { # Determine the first line to output my $iFilterFirst = $iIndex - $iFilterContext; # Don't go past the beginning $iFilterFirst = $iFilterFirst < 0 ? 0 : $iFilterFirst; # Don't repeat lines that have already been output $iFilterFirst = $iFilterFirst <= $iLastOutput ? $iLastOutput + 1 : $iFilterFirst; # Determine the last line to output my $iFilterLast = $iIndex + $iFilterContext; # Don't got past the end $iFilterLast = $iFilterLast >= @stryOutput ? @stryOutput -1 : $iFilterLast; # Mark filtered lines if any if ($iFilterFirst > $iLastOutput + 1) { my $iFiltered = $iFilterFirst - ($iLastOutput + 1); if ($iFiltered > 1) { $strOutput .= (defined($strOutput) ? "\n" : '') . " [filtered ${iFiltered} lines of output]"; } else { $iFilterFirst -= 1; } } # Output the lines for (my $iOutputIndex = $iFilterFirst; $iOutputIndex <= $iFilterLast; $iOutputIndex++) { $strOutput .= (defined($strOutput) ? "\n" : '') . $stryOutput[$iOutputIndex]; } $iLastOutput = $iFilterLast; } } if (@stryOutput - 1 > $iLastOutput + 1) { my $iFiltered = (@stryOutput - 1) - ($iLastOutput + 1); if ($iFiltered > 1) { $strOutput .= (defined($strOutput) ? "\n" : '') . " [filtered ${iFiltered} lines of output]"; } else { $strOutput .= (defined($strOutput) ? "\n" : '') . 
                                $stryOutput[-1];
                        }
                    }
                }

                if (!$$hCacheKey{output})
                {
                    $strOutput = undef;
                }

                if (defined($strOutput))
                {
                    my @stryOutput = split("\n", $strOutput);
                    $$hCacheValue{output} = \@stryOutput;
                }

                if ($bCache)
                {
                    $self->cachePush($strCacheType, $hCacheKey, $hCacheValue);
                }
            }

            # Output is assigned to a var
            if ($oCommand->paramTest('variable-key'))
            {
                $self->{oManifest}->variableSet($oCommand->paramGet('variable-key'), trim($strOutput), true);
            }
        }
        elsif ($$hCacheKey{output})
        {
            $strOutput = 'Output suppressed for testing';
        }
    }

    # Default variable output when it was not set by execution
    if ($oCommand->paramTest('variable-key') && !defined($self->{oManifest}->variableGet($oCommand->paramGet('variable-key'))))
    {
        $self->{oManifest}->variableSet($oCommand->paramGet('variable-key'), '[Test Variable]', true);
    }

    # Return from function and log return values if any
    return logDebugReturn
    (
        $strOperation,
        {name => 'strCommand', value => $strCommand, trace => true},
        {name => 'strOutput', value => $strOutput, trace => true}
    );
}

####################################################################################################################################
# configKey
####################################################################################################################################
sub configKey
{
    my $self = shift;

    # Assign function parameters, defaults, and log debug info
    my
    (
        $strOperation,
        $oConfig,
    ) =
        logDebugParam
        (
            __PACKAGE__ . '->configKey', \@_,
            {name => 'oConfig', trace => true},
        );

    my $hCacheKey =
    {
        host => $self->{oManifest}->variableReplace($oConfig->paramGet('host')),
        file => $self->{oManifest}->variableReplace($oConfig->paramGet('file')),
    };

    if ($oConfig->paramTest('reset', 'y'))
    {
        $$hCacheKey{reset} = JSON::PP::true;
    }

    # Add all options to the key
    my $strOptionTag = $oConfig->nameGet() eq 'backrest-config' ?
'backrest-config-option' : 'postgres-config-option'; foreach my $oOption ($oConfig->nodeList($strOptionTag)) { my $hOption = {}; if ($oOption->paramTest('remove', 'y')) { $$hOption{remove} = JSON::PP::true; } if (defined($oOption->valueGet(false))) { $$hOption{value} = $self->{oManifest}->variableReplace($oOption->valueGet()); } my $strKey = $self->{oManifest}->variableReplace($oOption->paramGet('key')); if ($oConfig->nameGet() eq 'backrest-config') { my $strSection = $self->{oManifest}->variableReplace($oOption->paramGet('section')); $$hCacheKey{option}{$strSection}{$strKey} = $hOption; } else { $$hCacheKey{option}{$strKey} = $hOption; } } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'hCacheKey', value => $hCacheKey, trace => true} ); } #################################################################################################################################### # backrestConfig #################################################################################################################################### sub backrestConfig { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oSection, $oConfig, $iDepth ) = logDebugParam ( __PACKAGE__ . '->backrestConfig', \@_, {name => 'oSection'}, {name => 'oConfig'}, {name => 'iDepth'} ); # Working variables my $hCacheKey = $self->configKey($oConfig); my $strFile = $$hCacheKey{file}; my $strConfig = undef; &log(DEBUG, (' ' x $iDepth) . 'process backrest config: ' . $$hCacheKey{file}); if ($self->{bExe} && $self->isRequired($oSection)) { my ($bCacheHit, $strCacheType, $hCacheKey, $hCacheValue) = $self->cachePop('cfg-' . PROJECT_EXE, $hCacheKey); if ($bCacheHit) { $strConfig = defined($$hCacheValue{config}) ? 
join("\n", @{$$hCacheValue{config}}) : undef; } else { # Check that the host is valid my $strHostName = $self->{oManifest}->variableReplace($oConfig->paramGet('host')); my $oHost = $self->{host}{$strHostName}; if (!defined($oHost)) { confess &log(ERROR, "cannot configure backrest on host ${strHostName} because the host does not exist"); } # Reset all options if ($oConfig->paramTest('reset', 'y')) { delete(${$self->{config}}{$strHostName}{$$hCacheKey{file}}) } foreach my $oOption ($oConfig->nodeList('backrest-config-option')) { my $strSection = $self->{oManifest}->variableReplace($oOption->paramGet('section')); my $strKey = $self->{oManifest}->variableReplace($oOption->paramGet('key')); my $strValue; if (!$oOption->paramTest('remove', 'y')) { $strValue = $self->{oManifest}->variableReplace(trim($oOption->valueGet(false))); } if (!defined($strValue)) { delete(${$self->{config}}{$strHostName}{$$hCacheKey{file}}{$strSection}{$strKey}); if (keys(%{${$self->{config}}{$strHostName}{$$hCacheKey{file}}{$strSection}}) == 0) { delete(${$self->{config}}{$strHostName}{$$hCacheKey{file}}{$strSection}); } &log(DEBUG, (' ' x ($iDepth + 1)) . 
"reset ${strSection}->${strKey}"); } else { # If this option is a hash and the value is already set then append to the array if (defined(cfgDefine()->{$strKey}) && cfgDefine()->{$strKey}{&CFGDEF_TYPE} eq CFGDEF_TYPE_HASH && defined(${$self->{config}}{$strHostName}{$$hCacheKey{file}}{$strSection}{$strKey})) { my @oValue = (); my $strHashValue = ${$self->{config}}{$strHostName}{$$hCacheKey{file}}{$strSection}{$strKey}; # If there is only one key/value if (ref(\$strHashValue) eq 'SCALAR') { push(@oValue, $strHashValue); } # Else if there is an array of values else { @oValue = @{$strHashValue}; } push(@oValue, $strValue); ${$self->{config}}{$strHostName}{$$hCacheKey{file}}{$strSection}{$strKey} = \@oValue; } # else just set the value else { ${$self->{config}}{$strHostName}{$$hCacheKey{file}}{$strSection}{$strKey} = $strValue; } &log(DEBUG, (' ' x ($iDepth + 1)) . "set ${strSection}->${strKey} = ${strValue}"); } } my $strLocalFile = '/home/' . DOC_USER . '/data/pgbackrest.conf'; # Save the ini file $self->{oManifest}->storage()->put($strLocalFile, iniRender($self->{config}{$strHostName}{$$hCacheKey{file}}, true)); $oHost->copyTo( $strLocalFile, $$hCacheKey{file}, $self->{oManifest}->variableReplace($oConfig->paramGet('owner', false, 'postgres:postgres')), '640'); # Remove the log-console-stderr option before pushing into the cache # ??? 
This is not very pretty and should be replaced with a general way to hide config options my $oConfigClean = dclone($self->{config}{$strHostName}{$$hCacheKey{file}}); delete($$oConfigClean{&CFGDEF_SECTION_GLOBAL}{&CFGOPT_LOG_LEVEL_STDERR}); delete($$oConfigClean{&CFGDEF_SECTION_GLOBAL}{&CFGOPT_LOG_TIMESTAMP}); if (keys(%{$$oConfigClean{&CFGDEF_SECTION_GLOBAL}}) == 0) { delete($$oConfigClean{&CFGDEF_SECTION_GLOBAL}); } $self->{oManifest}->storage()->put("${strLocalFile}.clean", iniRender($oConfigClean, true)); # Push config file into the cache $strConfig = ${$self->{oManifest}->storage()->get("${strLocalFile}.clean")}; my @stryConfig = undef; if (trim($strConfig) ne '') { @stryConfig = split("\n", $strConfig); } $$hCacheValue{config} = \@stryConfig; $self->cachePush($strCacheType, $hCacheKey, $hCacheValue); } } else { $strConfig = 'Config suppressed for testing'; } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strFile', value => $strFile, trace => true}, {name => 'strConfig', value => $strConfig, trace => true}, {name => 'bShow', value => $oConfig->paramTest('show', 'n') ? false : true, trace => true} ); } #################################################################################################################################### # postgresConfig #################################################################################################################################### sub postgresConfig { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oSection, $oConfig, $iDepth ) = logDebugParam ( __PACKAGE__ . 
'->postgresConfig', \@_, {name => 'oSection'}, {name => 'oConfig'}, {name => 'iDepth'} ); # Working variables my $hCacheKey = $self->configKey($oConfig); my $strFile = $$hCacheKey{file}; my $strConfig; if ($self->{bExe} && $self->isRequired($oSection)) { my ($bCacheHit, $strCacheType, $hCacheKey, $hCacheValue) = $self->cachePop('cfg-postgresql', $hCacheKey); if ($bCacheHit) { $strConfig = defined($$hCacheValue{config}) ? join("\n", @{$$hCacheValue{config}}) : undef; } else { # Check that the host is valid my $strHostName = $self->{oManifest}->variableReplace($oConfig->paramGet('host')); my $oHost = $self->{host}{$strHostName}; if (!defined($oHost)) { confess &log(ERROR, "cannot configure postgres on host ${strHostName} because the host does not exist"); } my $strLocalFile = '/home/' . DOC_USER . '/data/postgresql.conf'; $oHost->copyFrom($$hCacheKey{file}, $strLocalFile); if (!defined(${$self->{'pg-config'}}{$strHostName}{$$hCacheKey{file}}{base}) && $self->{bExe}) { ${$self->{'pg-config'}}{$strHostName}{$$hCacheKey{file}}{base} = ${$self->{oManifest}->storage()->get($strLocalFile)}; } my $oConfigHash = $self->{'pg-config'}{$strHostName}{$$hCacheKey{file}}; my $oConfigHashNew; if (!defined($$oConfigHash{old})) { $oConfigHashNew = {}; $$oConfigHash{old} = {} } else { $oConfigHashNew = dclone($$oConfigHash{old}); } &log(DEBUG, (' ' x $iDepth) . 'process postgres config: ' . $$hCacheKey{file}); foreach my $oOption ($oConfig->nodeList('postgres-config-option')) { my $strKey = $oOption->paramGet('key'); my $strValue = $self->{oManifest}->variableReplace(trim($oOption->valueGet())); if ($strValue eq '') { delete($$oConfigHashNew{$strKey}); &log(DEBUG, (' ' x ($iDepth + 1)) . "reset ${strKey}"); } else { $$oConfigHashNew{$strKey} = $strValue; &log(DEBUG, (' ' x ($iDepth + 1)) . 
"set ${strKey} = ${strValue}"); } } # Generate config text foreach my $strKey (sort(keys(%$oConfigHashNew))) { if (defined($strConfig)) { $strConfig .= "\n"; } $strConfig .= "${strKey} = $$oConfigHashNew{$strKey}"; } # Save the conf file if ($self->{bExe}) { $self->{oManifest}->storage()->put($strLocalFile, $$oConfigHash{base} . (defined($strConfig) ? "\n# pgBackRest Configuration\n${strConfig}\n" : '')); $oHost->copyTo($strLocalFile, $$hCacheKey{file}, 'postgres:postgres', '640'); } $$oConfigHash{old} = $oConfigHashNew; my @stryConfig = undef; if (trim($strConfig) ne '') { @stryConfig = split("\n", $strConfig); } $$hCacheValue{config} = \@stryConfig; $self->cachePush($strCacheType, $hCacheKey, $hCacheValue); } } else { $strConfig = 'Config suppressed for testing'; } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strFile', value => $strFile, trace => true}, {name => 'strConfig', value => $strConfig, trace => true}, {name => 'bShow', value => $oConfig->paramTest('show', 'n') ? false : true, trace => true} ); } #################################################################################################################################### # hostKey #################################################################################################################################### sub hostKey { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oHost, ) = logDebugParam ( __PACKAGE__ . 
'->hostKey', \@_,
            {name => 'oHost', trace => true},
        );

    my $hCacheKey =
    {
        name => $self->{oManifest}->variableReplace($oHost->paramGet('name')),
        image => $self->{oManifest}->variableReplace($oHost->paramGet('image')),
    };

    if (defined($oHost->paramGet('id', false)))
    {
        $hCacheKey->{id} = $self->{oManifest}->variableReplace($oHost->paramGet('id'));
    }
    else
    {
        $hCacheKey->{id} = $hCacheKey->{name};
    }

    if (defined($oHost->paramGet('option', false)))
    {
        $$hCacheKey{option} = $self->{oManifest}->variableReplace($oHost->paramGet('option'));
    }

    if (defined($oHost->paramGet('param', false)))
    {
        $$hCacheKey{param} = $self->{oManifest}->variableReplace($oHost->paramGet('param'));
    }

    if (defined($oHost->paramGet('os', false)))
    {
        $$hCacheKey{os} = $self->{oManifest}->variableReplace($oHost->paramGet('os'));
    }

    $$hCacheKey{'update-hosts'} = $oHost->paramTest('update-hosts', 'n') ? JSON::PP::false : JSON::PP::true;

    # Return from function and log return values if any
    return logDebugReturn
    (
        $strOperation,
        {name => 'hCacheKey', value => $hCacheKey, trace => true}
    );
}

####################################################################################################################################
# cachePop
####################################################################################################################################
sub cachePop
{
    my $self = shift;

    # Assign function parameters, defaults, and log debug info
    my
    (
        $strOperation,
        $strCacheType,
        $hCacheKey,
    ) =
        logDebugParam
        (
            __PACKAGE__ . '->cachePop', \@_,
            {name => 'strCacheType', trace => true},
            {name => 'hCacheKey', trace => true},
        );

    my $bCacheHit = false;
    my $oCacheValue = undef;

    if ($self->{bCache})
    {
        my $oJSON = JSON::PP->new()->canonical()->allow_nonref();
        # &log(WARN, "checking cache for\ncurrent key: " . $oJSON->encode($hCacheKey));

        my $hCache = ${$self->{oSource}{hyCache}}[$self->{iCacheIdx}];

        if (!defined($hCache))
        {
            confess &log(ERROR, 'unable to get index from cache', ERROR_FILE_INVALID);
        }

        if (!defined($$hCache{key}))
        {
            confess &log(ERROR, 'unable to get key from cache', ERROR_FILE_INVALID);
        }

        if (!defined($$hCache{type}))
        {
            confess &log(ERROR, 'unable to get type from cache', ERROR_FILE_INVALID);
        }

        if ($$hCache{type} ne $strCacheType)
        {
            confess &log(ERROR, 'types do not match, cache is invalid', ERROR_FILE_INVALID);
        }

        if ($oJSON->encode($$hCache{key}) ne $oJSON->encode($hCacheKey))
        {
            confess &log(
                ERROR,
                "keys at index $self->{iCacheIdx} do not match, cache is invalid." .
                    "\n cache key: " . $oJSON->encode($$hCache{key}) . "\ncurrent key: " . $oJSON->encode($hCacheKey),
                ERROR_FILE_INVALID);
        }

        $bCacheHit = true;
        $oCacheValue = $$hCache{value};

        $self->{iCacheIdx}++;
    }
    else
    {
        if ($self->{oManifest}{bCacheOnly})
        {
            confess &log(ERROR, 'Cache only operation forced by --cache-only option');
        }
    }

    # Return from function and log return values if any
    return logDebugReturn
    (
        $strOperation,
        {name => 'bCacheHit', value => $bCacheHit, trace => true},
        {name => 'strCacheType', value => $strCacheType, trace => true},
        {name => 'hCacheKey', value => $hCacheKey, trace => true},
        {name => 'oCacheValue', value => $oCacheValue, trace => true},
    );
}

####################################################################################################################################
# cachePush
####################################################################################################################################
sub cachePush
{
    my $self = shift;

    # Assign function parameters, defaults, and log debug info
    my
    (
        $strOperation,
        $strType,
        $hCacheKey,
        $oCacheValue,
    ) =
        logDebugParam
        (
            __PACKAGE__ . '->cachePush', \@_,
            {name => 'strType', trace => true},
            {name => 'hCacheKey', trace => true},
            {name => 'oCacheValue', required => false, trace => true},
        );

    if ($self->{bCache})
    {
        confess &log(ASSERT, "cachePush should not be called when cache is already present");
    }

    # Create the cache entry
    my $hCache =
    {
        key => $hCacheKey,
        type => $strType,
    };

    if (defined($oCacheValue))
    {
        $$hCache{value} = $oCacheValue;
    }

    push @{$self->{oSource}{hyCache}}, $hCache;

    # Return from function and log return values if any
    return logDebugReturn($strOperation);
}

####################################################################################################################################
# sectionChildProcess
####################################################################################################################################
sub sectionChildProcess
{
    my $self = shift;

    # Assign function parameters, defaults, and log debug info
    my
    (
        $strOperation,
        $oSection,
        $oChild,
        $iDepth
    ) =
        logDebugParam
        (
            __PACKAGE__ . '->sectionChildProcess', \@_,
            {name => 'oSection'},
            {name => 'oChild'},
            {name => 'iDepth'}
        );

    &log(DEBUG, (' ' x ($iDepth + 1)) . 'process child: ' . $oChild->nameGet());

    # Execute a command
    if ($oChild->nameGet() eq 'host-add')
    {
        if ($self->{bExe} && $self->isRequired($oSection))
        {
            my ($bCacheHit, $strCacheType, $hCacheKey, $hCacheValue) = $self->cachePop('host', $self->hostKey($oChild));

            if ($bCacheHit)
            {
                $self->{oManifest}->variableSet('host-' . $hCacheKey->{id} .
'-ip', $hCacheValue->{ip}, true);
            }
            else
            {
                if (defined($self->{host}{$$hCacheKey{name}}))
                {
                    confess &log(ERROR, "cannot add host $$hCacheKey{name} because the host already exists");
                }

                executeTest("rm -rf ~/data/$$hCacheKey{name}");
                executeTest("mkdir -p ~/data/$$hCacheKey{name}/etc");

                my $strHost = $hCacheKey->{name};
                my $strImage = $hCacheKey->{image};
                my $strHostUser = $self->{oManifest}->variableReplace($oChild->paramGet('user'));

                # Determine if a pre-built image should be created
                if (defined($self->preExecute($strHost)))
                {
                    my $strPreImage = "${strImage}-${strHost}";
                    my $strFrom = $strImage;

                    &log(INFO, "Build vm '${strPreImage}' from '${strFrom}'");

                    my $strCommandList;

                    # Add all pre commands
                    foreach my $oExecute ($self->preExecute($strHost))
                    {
                        my $hExecuteKey = $self->executeKey($strHost, $oExecute);

                        my $strCommand =
                            join("\n", @{$hExecuteKey->{cmd}}) .
                            (defined($hExecuteKey->{'cmd-extra'}) ? ' ' . $hExecuteKey->{'cmd-extra'} : '');
                        $strCommand =~ s/'/'\\''/g;

                        $strCommand =
                            "sudo -u ${strHostUser}" .
                            ($hCacheKey->{'bash-wrap'} ?
                                " bash" . ($hCacheKey->{'load-env'} ? ' -l' : '') . " -c '${strCommand}'" : " ${strCommand}");

                        if (defined($strCommandList))
                        {
                            $strCommandList .= "\n";
                        }

                        $strCommandList .= "RUN ${strCommand}";

                        &log(DETAIL, " Pre command $strCommand");
                    }

                    # Build container
                    my $strDockerfile = $self->{oManifest}{strDocPath} . "/output/doc-host.dockerfile";

                    $self->{oManifest}{oStorage}->put(
                        $strDockerfile,
                        "FROM ${strFrom}\n\n" . trim($self->{oManifest}->variableReplace($strCommandList)) . "\n");

                    executeTest("docker build -f ${strDockerfile} -t ${strPreImage} " .
$self->{oManifest}{oStorage}->pathGet());

                    # Use the pre-built image
                    $strImage = $strPreImage;
                }

                my $strHostRepoPath = dirname(dirname(abs_path($0)));

                # Replace host repo path in mount if present
                my $strMount = undef;

                if (defined($oChild->paramGet('mount', false)))
                {
                    $strMount = $self->{oManifest}->variableReplace($oChild->paramGet('mount'));
                    $strMount =~ s/\{\[host\-repo\-path\]\}/${strHostRepoPath}/g;
                }

                # Replace host repo path in option if present
                my $strOption = $$hCacheKey{option};

                if (defined($strOption))
                {
                    $strOption =~ s/\{\[host\-repo\-path\]\}/${strHostRepoPath}/g;
                }

                my $oHost = new pgBackRestTest::Common::HostTest(
                    $$hCacheKey{name}, "doc-$$hCacheKey{name}", $strImage, $strHostUser, $$hCacheKey{os},
                    defined($strMount) ? [$strMount] : undef, $strOption, $$hCacheKey{param}, $$hCacheKey{'update-hosts'});

                $self->{host}{$$hCacheKey{name}} = $oHost;
                $self->{oManifest}->variableSet('host-' . $hCacheKey->{id} . '-ip', $oHost->{strIP}, true);
                $$hCacheValue{ip} = $oHost->{strIP};

                # Add to the host group
                my $oHostGroup = hostGroupGet();
                $oHostGroup->hostAdd($oHost);

                # Execute initialize commands
                foreach my $oExecute ($oChild->nodeList('execute', false))
                {
                    $self->execute(
                        $oSection, $$hCacheKey{name}, $oExecute, {iIndent => $iDepth + 1, bCache => false, bShow => false});
                }

                $self->cachePush($strCacheType, $hCacheKey, $hCacheValue);
            }
        }
    }
    # Skip children that have already been processed and error on others
    elsif ($oChild->nameGet() ne 'title')
    {
        confess &log(ASSERT, 'unable to process child type ' .
$oChild->nameGet());
    }

    # Return from function and log return values if any
    return logDebugReturn
    (
        $strOperation
    );
}

1;

pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Common/DocManifest.pm

####################################################################################################################################
# DOC MANIFEST MODULE
####################################################################################################################################
package pgBackRestDoc::Common::DocManifest;

use strict;
use warnings FATAL => qw(all);
use Carp qw(confess);

use Cwd qw(abs_path);
use Exporter qw(import);
    our @EXPORT = qw();
use File::Basename qw(dirname);
use JSON::PP;

use pgBackRestDoc::Common::Log;
use pgBackRestDoc::Common::String;

####################################################################################################################################
# File constants
####################################################################################################################################
use constant FILE_MANIFEST => 'manifest.xml';

####################################################################################################################################
# Render constants
####################################################################################################################################
use constant RENDER => 'render';
use constant RENDER_COMPACT => 'compact';
    push @EXPORT, qw(RENDER_COMPACT);
use constant RENDER_FILE => 'file';
use constant RENDER_MENU => 'menu';
    push @EXPORT, qw(RENDER_MENU);
use constant RENDER_PRETTY => 'pretty';
    push @EXPORT, qw(RENDER_PRETTY);
use constant RENDER_TYPE => 'type';
use constant RENDER_TYPE_HTML => 'html';
    push @EXPORT, qw(RENDER_TYPE_HTML);
use constant RENDER_TYPE_MARKDOWN => 'markdown';
    push @EXPORT, qw(RENDER_TYPE_MARKDOWN);
use constant RENDER_TYPE_PDF => 'pdf';
    push @EXPORT,
qw(RENDER_TYPE_PDF); #################################################################################################################################### # CONSTRUCTOR #################################################################################################################################### sub new { my $class = shift; # Class name # Create the class hash my $self = {}; bless $self, $class; # Assign function parameters, defaults, and log debug info ( my $strOperation, $self->{oStorage}, $self->{stryRequire}, $self->{stryInclude}, $self->{stryExclude}, $self->{rhKeyVariableOverride}, my $rhVariableOverride, $self->{strDocPath}, $self->{bDeploy}, $self->{bCacheOnly}, $self->{bPre}, ) = logDebugParam ( __PACKAGE__ . '->new', \@_, {name => 'oStorage'}, {name => 'stryRequire'}, {name => 'stryInclude'}, {name => 'stryExclude'}, {name => 'rhKeyVariableOverride', required => false}, {name => 'rhVariableOverride', required => false}, {name => 'strDocPath', required => false}, {name => 'bDeploy', required => false}, {name => 'bCacheOnly', required => false}, {name => 'bPre', required => false, default => false}, ); # Set the bin path $self->{strBinPath} = abs_path(dirname($0)); # Set the base path if it was not passed in if (!defined($self->{strDocPath})) { $self->{strDocPath} = $self->{strBinPath}; } # Set cache file names $self->{strExeCacheLocal} = $self->{strDocPath} . "/output/exe.cache"; $self->{strExeCacheDeploy} = $self->{strDocPath} . 
"/resource/exe.cache"; # Load the manifest $self->{oManifestXml} = new pgBackRestDoc::Common::Doc("$self->{strDocPath}/manifest.xml"); # Iterate the sources $self->{oManifest} = {}; foreach my $oSource ($self->{oManifestXml}->nodeGet('source-list')->nodeList('source')) { my $oSourceHash = {}; my $strKey = $oSource->paramGet('key'); my $strSourceType = $oSource->paramGet('type', false); logDebugMisc ( $strOperation, 'load source', {name => 'strKey', value => $strKey}, {name => 'strSourceType', value => $strSourceType} ); # Skip sources in exclude list if (grep(/^$strKey$/, @{$self->{stryExclude}})) { next; } # Help is in src/build/help if ($strKey eq 'help') { $oSourceHash->{doc} = new pgBackRestDoc::Common::Doc("$self->{strDocPath}/../src/build/help/${strKey}.xml"); } # Else should be in doc/xml else { $$oSourceHash{doc} = new pgBackRestDoc::Common::Doc("$self->{strDocPath}/xml/${strKey}.xml"); } # Read variables from source $self->variableListParse($$oSourceHash{doc}->nodeGet('variable-list', false), $rhVariableOverride); ${$self->{oManifest}}{source}{$strKey} = $oSourceHash; ${$self->{oManifest}}{source}{$strKey}{strSourceType} = $strSourceType; } # Iterate the renderers foreach my $oRender ($self->{oManifestXml}->nodeGet('render-list')->nodeList('render')) { my $oRenderHash = {}; my $strType = $oRender->paramGet(RENDER_TYPE); # Only one instance of each render type can be defined if (defined(${$self->{oManifest}}{&RENDER}{$strType})) { confess &log(ERROR, "render ${strType} has already been defined"); } # Get the file param $${oRenderHash}{file} = $oRender->paramGet(RENDER_FILE, false); $${oRenderHash}{&RENDER_COMPACT} = $oRender->paramGet(RENDER_COMPACT, false, 'n') eq 'y' ? true : false; $${oRenderHash}{&RENDER_PRETTY} = $oRender->paramGet(RENDER_PRETTY, false, 'n') eq 'y' ? 
true : false; $${oRenderHash}{&RENDER_MENU} = false; logDebugMisc ( $strOperation, ' load render', {name => 'strType', value => $strType}, {name => 'strFile', value => $${oRenderHash}{file}} ); # Error if file is set and render type is not pdf if (defined($${oRenderHash}{file}) && $strType ne RENDER_TYPE_PDF) { confess &log(ERROR, 'only the pdf render type can have file set') } # Iterate the render sources foreach my $oRenderOut ($oRender->nodeList('render-source')) { my $oRenderOutHash = {}; my $strKey = $oRenderOut->paramGet('key'); my $strSource = $oRenderOut->paramGet('source', false, $strKey); # Skip sources in exclude list if (grep(/^$strSource$/, @{$self->{stryExclude}})) { next; } # Skip sources not in include list if (@{$self->{stryInclude}} > 0 && !grep(/^$strSource$/, @{$self->{stryInclude}})) { next; } # Preserve natural order push(@{$${oRenderHash}{stryOrder}}, $strKey); $$oRenderOutHash{source} = $strSource; # Get the filename if (defined($oRenderOut->paramGet('file', false))) { if ($strType eq RENDER_TYPE_HTML || $strType eq RENDER_TYPE_MARKDOWN) { $$oRenderOutHash{file} = $oRenderOut->paramGet('file'); } else { confess &log(ERROR, "file is only valid with html or markdown render types"); } } # Get the menu caption if (defined($oRenderOut->paramGet('menu', false)) && $strType ne RENDER_TYPE_HTML) { confess &log(ERROR, "menu is only valid with html render type"); } if (defined($oRenderOut->paramGet('menu', false))) { $${oRenderHash}{&RENDER_MENU} = true; if ($strType eq RENDER_TYPE_HTML) { $$oRenderOutHash{menu} = $oRenderOut->paramGet('menu', false); } else { confess &log(ERROR, 'only the html render type can have menu set'); } } logDebugMisc ( $strOperation, ' load render source', {name => 'strKey', value => $strKey}, {name => 'strSource', value => $strSource}, {name => 'strMenu', value => $${oRenderOutHash}{menu}} ); $${oRenderHash}{out}{$strKey} = $oRenderOutHash; } ${$self->{oManifest}}{render}{$strType} = $oRenderHash; } # Set the doc path 
variable $self->variableSet('doc-path', $self->{strDocPath}); # Read variables from manifest $self->variableListParse($self->{oManifestXml}->nodeGet('variable-list', false), $rhVariableOverride); # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'self', value => $self} ); } #################################################################################################################################### # isBackRest # # Until all the backrest specific code can be abstracted, this function will identify when BackRest docs are being built. #################################################################################################################################### sub isBackRest { my $self = shift; return($self->variableTest('project-exe', 'pgbackrest')); } #################################################################################################################################### # Evaluate the if condition for a node #################################################################################################################################### sub evaluateIf { my $self = shift; my $oNode = shift; my $bIf = true; # Evaluate if condition if (defined($oNode->paramGet('if', false))) { my $strIf = $self->variableReplace($oNode->paramGet('if')); # In this case we really do want to evaluate the contents and not treat it as a literal $bIf = eval($strIf); # Error if the eval failed if ($@) { confess &log(ERROR, "unable to evaluate '${strIf}': $@"); } } return $bIf; } #################################################################################################################################### # variableListParse # # Parse a variable list and store variables. 
#################################################################################################################################### sub variableListParse { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oVariableList, $rhVariableOverride ) = logDebugParam ( __PACKAGE__ . '->variableListParse', \@_, {name => '$oVariableList', required => false}, {name => '$rhVariableOverride', required => false} ); if (defined($oVariableList)) { foreach my $oVariable ($oVariableList->nodeList('variable')) { if ($self->evaluateIf($oVariable)) { my $strKey = $oVariable->paramGet('key'); my $strValue = $self->variableReplace($oVariable->valueGet()); if ($oVariable->paramTest('eval', 'y')) { # In this case we really do want to evaluate the contents of strValue and not treat it as a literal. $strValue = eval($strValue); if ($@) { confess &log(ERROR, "unable to evaluate ${strKey}: $@\n" . $oVariable->valueGet()); } } $self->variableSet($strKey, defined($rhVariableOverride->{$strKey}) ? $rhVariableOverride->{$strKey} : $strValue); logDebugMisc ( $strOperation, ' load variable', {name => 'strKey', value => $strKey}, {name => 'strValue', value => $strValue} ); } } } # Return from function and log return values if any return logDebugReturn($strOperation); } #################################################################################################################################### # variableReplace # # Replace variables in the string. 
#################################################################################################################################### sub variableReplace { my $self = shift; my $strBuffer = shift; my $strType = shift; if (!defined($strBuffer)) { return; } foreach my $strName (sort(keys(%{$self->{oVariable}}))) { my $strValue = $self->{oVariable}{$strName}; $strBuffer =~ s/\{\[$strName\]\}/$strValue/g; } if (defined($strType) && $strType eq 'latex') { $strBuffer =~ s/\\\_/\_/g; $strBuffer =~ s/\_/\\\_/g; $strBuffer =~ s/\\\#/\#/g; $strBuffer =~ s/\#/\\\#/g; } return $strBuffer; } #################################################################################################################################### # variableSet # # Set a variable to be replaced later. #################################################################################################################################### sub variableSet { my $self = shift; my $strKey = shift; my $strValue = shift; my $bForce = shift; if (defined(${$self->{oVariable}}{$strKey}) && (!defined($bForce) || !$bForce)) { confess &log(ERROR, "${strKey} variable is already defined"); } ${$self->{oVariable}}{$strKey} = $self->variableReplace($strValue); } #################################################################################################################################### # variableGet # # Get the current value of a variable. #################################################################################################################################### sub variableGet { my $self = shift; my $strKey = shift; return ${$self->{oVariable}}{$strKey}; } #################################################################################################################################### # variableTest # # Test that a variable is defined or has an expected value. 
#################################################################################################################################### sub variableTest { my $self = shift; my $strKey = shift; my $strExpectedValue = shift; # Get the variable my $strValue = ${$self->{oVariable}}{$strKey}; # Return false if it is not defined if (!defined($strValue)) { return false; } # Return false if it does not equal the expected value if (defined($strExpectedValue) && $strValue ne $strExpectedValue) { return false; } return true; } #################################################################################################################################### # Get list of source documents #################################################################################################################################### sub sourceList { my $self = shift; # Assign function parameters, defaults, and log debug info my ($strOperation) = logDebugParam(__PACKAGE__ . '->sourceList'); # Check that sources exist my @strySource; if (defined(${$self->{oManifest}}{source})) { @strySource = sort(keys(%{${$self->{oManifest}}{source}})); } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strySource', value => \@strySource} ); } #################################################################################################################################### # sourceGet #################################################################################################################################### sub sourceGet { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $strSource ) = logDebugParam ( __PACKAGE__ . 
'->sourceGet', \@_, {name => 'strSource', trace => true} ); if (!defined(${$self->{oManifest}}{source}{$strSource})) { confess &log(ERROR, "source ${strSource} does not exist"); } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'oSource', value => ${$self->{oManifest}}{source}{$strSource}} ); } #################################################################################################################################### # renderList #################################################################################################################################### sub renderList { my $self = shift; # Assign function parameters, defaults, and log debug info my ($strOperation) = logDebugParam(__PACKAGE__ . '->renderList'); # Check that the render output exists my @stryRender; if (defined(${$self->{oManifest}}{render})) { @stryRender = sort(keys(%{${$self->{oManifest}}{render}})); } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'stryRender', value => \@stryRender} ); } #################################################################################################################################### # renderGet #################################################################################################################################### sub renderGet { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $strType ) = logDebugParam ( __PACKAGE__ . 
'->renderGet', \@_, {name => 'strType', trace => true} ); # Check that the render exists if (!defined(${$self->{oManifest}}{render}{$strType})) { confess &log(ERROR, "render type ${strType} does not exist"); } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'oRenderOut', value => ${$self->{oManifest}}{render}{$strType}} ); } #################################################################################################################################### # renderOutList #################################################################################################################################### sub renderOutList { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $strType ) = logDebugParam ( __PACKAGE__ . '->renderOutList', \@_, {name => 'strType'} ); # Check that the render output exists my @stryRenderOut; if (defined(${$self->{oManifest}}{render}{$strType})) { @stryRenderOut = sort(keys(%{${$self->{oManifest}}{render}{$strType}{out}})); } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'stryRenderOut', value => \@stryRenderOut} ); } #################################################################################################################################### # renderOutGet #################################################################################################################################### sub renderOutGet { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $strType, $strKey, $bIgnoreMissing, ) = logDebugParam ( __PACKAGE__ . 
'->renderOutGet', \@_, {name => 'strType', trace => true}, {name => 'strKey', trace => true}, {name => 'bIgnoreMissing', default => false, trace => true}, ); if (!defined(${$self->{oManifest}}{render}{$strType}{out}{$strKey}) && !$bIgnoreMissing) { confess &log(ERROR, "render out ${strKey} does not exist"); } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'oRenderOut', value => ${$self->{oManifest}}{render}{$strType}{out}{$strKey}} ); } #################################################################################################################################### # cacheKey #################################################################################################################################### sub cacheKey { my $self = shift; # Assign function parameters, defaults, and log debug info my ($strOperation) = logDebugParam(__PACKAGE__ . '->cacheKey'); # Generate a cache key from the variable override my $strVariableKey = JSON::PP->new()->canonical()->allow_nonref()->encode($self->{rhKeyVariableOverride}); if ($strVariableKey eq '{}') { $strVariableKey = 'default'; } my $strRequire = defined($self->{stryRequire}) && @{$self->{stryRequire}} > 0 ? join("\n", @{$self->{stryRequire}}) : 'all'; # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strVariableKey', value => $strVariableKey}, {name => 'strRequire', value => $strRequire}, ); } #################################################################################################################################### # cacheRead #################################################################################################################################### sub cacheRead { my $self = shift; # Assign function parameters, defaults, and log debug info my ($strOperation) = logDebugParam(__PACKAGE__ . '->cacheRead'); $self->{hCache} = undef; my $strCacheFile = $self->{bDeploy} ? 
$self->{strExeCacheDeploy} : $self->{strExeCacheLocal}; if (!$self->storage()->exists($strCacheFile) && !$self->{bDeploy}) { $strCacheFile = $self->{strExeCacheDeploy}; } if ($self->storage()->exists($strCacheFile)) { my ($strCacheKey, $strRequire) = $self->cacheKey(); my $oJSON = JSON::PP->new()->allow_nonref(); $self->{hCache} = $oJSON->decode(${$self->storage()->get($strCacheFile)}); foreach my $strSource (sort(keys(%{${$self->{oManifest}}{source}}))) { my $hSource = ${$self->{oManifest}}{source}{$strSource}; if (defined(${$self->{hCache}}{$strCacheKey}{$strRequire}{$strSource})) { $$hSource{hyCache} = ${$self->{hCache}}{$strCacheKey}{$strRequire}{$strSource}; &log(DETAIL, "cache load $strSource (key = ${strCacheKey}, require = ${strRequire})"); } } } # Return from function and log return values if any return logDebugReturn($strOperation); } #################################################################################################################################### # cacheWrite #################################################################################################################################### sub cacheWrite { my $self = shift; # Assign function parameters, defaults, and log debug info my ($strOperation) = logDebugParam(__PACKAGE__ . '->cacheWrite'); my $strCacheFile = $self->{bDeploy} ? 
$self->{strExeCacheDeploy} : $self->{strExeCacheLocal};
    my ($strCacheKey, $strRequire) = $self->cacheKey();

    foreach my $strSource (sort(keys(%{${$self->{oManifest}}{source}})))
    {
        my $hSource = ${$self->{oManifest}}{source}{$strSource};

        if (defined($$hSource{hyCache}))
        {
            ${$self->{hCache}}{$strCacheKey}{$strRequire}{$strSource} = $$hSource{hyCache};
            &log(DETAIL, "cache save $strSource (key = ${strCacheKey}, require = ${strRequire})");
        }
    }

    if (defined($self->{hCache}))
    {
        my $oJSON = JSON::PP->new()->canonical()->allow_nonref()->pretty();
        $self->storage()->put($strCacheFile, $oJSON->encode($self->{hCache}));
    }

    # Return from function and log return values if any
    return logDebugReturn($strOperation);
}

####################################################################################################################################
# cacheReset
####################################################################################################################################
sub cacheReset
{
    my $self = shift;

    # Assign function parameters, defaults, and log debug info
    my
    (
        $strOperation,
        $strSource
    ) =
        logDebugParam
        (
            __PACKAGE__ .
'->cacheReset', \@_,
            {name => 'strSource', trace => true}
        );

    if ($self->{bCacheOnly})
    {
        confess &log(ERROR, 'Cache reset disabled by --cache-only option');
    }

    &log(WARN, "Cache will be reset for source ${strSource} and rendering retried automatically");
    delete(${$self->{oManifest}}{source}{$strSource}{hyCache});

    # Return from function and log return values if any
    return logDebugReturn($strOperation);
}

####################################################################################################################################
# Getters
####################################################################################################################################
sub storage {shift->{oStorage}};

1;

pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Common/DocRender.pm

####################################################################################################################################
# DOC RENDER MODULE
####################################################################################################################################
package pgBackRestDoc::Common::DocRender;

use strict;
use warnings FATAL => qw(all);
use Carp qw(confess);

use Exporter qw(import);
    our @EXPORT = qw();
use JSON::PP;
use Storable qw(dclone);

use pgBackRestDoc::Common::DocManifest;
use pgBackRestDoc::Common::Log;
use pgBackRestDoc::Common::String;

####################################################################################################################################
# XML tag/param constants
####################################################################################################################################
use constant XML_SECTION_PARAM_ANCHOR => 'anchor';
    push @EXPORT, qw(XML_SECTION_PARAM_ANCHOR);
use constant XML_SECTION_PARAM_ANCHOR_VALUE_NOINHERIT => 'no-inherit';
    push @EXPORT, qw(XML_SECTION_PARAM_ANCHOR_VALUE_NOINHERIT);
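The `$oRenderTag` map defined below drives markup conversion: for each output type, every element name is mapped to a begin/end string pair that is wrapped around the element's text when the document is rendered. A minimal standalone sketch of that lookup, assuming only a small illustrative subset of the real map (the `%hTagExample` data and `tagWrap` helper are hypothetical, not part of this module):

```perl
use strict;
use warnings;

# Illustrative subset of the tag map: element name => [begin, end] per output type
my %hTagExample =
(
    markdown => {'b' => ['**', '**'], 'file' => ['`', '`']},
    text => {'b' => ['', ''], 'file' => ['', '']},
);

# Wrap element text in the begin/end strings for the requested output type. An undef
# begin string (used by tags like 'exe' in the real map) is treated as empty here.
sub tagWrap
{
    my ($strType, $strTag, $strText) = @_;

    my $ryPair = $hTagExample{$strType}{$strTag}
        or die "tag ${strTag} not defined for type ${strType}";

    return
        (defined($ryPair->[0]) ? $ryPair->[0] : '') . $strText .
        (defined($ryPair->[1]) ? $ryPair->[1] : '');
}

print tagWrap('markdown', 'b', 'important'), "\n";      # **important**
print tagWrap('markdown', 'file', 'pg_hba.conf'), "\n"; # `pg_hba.conf`
```

The same element therefore renders differently per target: bold text becomes `**…**` in markdown but passes through unchanged in plain text, which is why the real map keys the pairs by output type first.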
####################################################################################################################################
# Render tags for various output types
####################################################################################################################################
my $oRenderTag =
{
    'markdown' =>
    {
        'quote' => ['"', '"'],
        'b' => ['**', '**'],
        'i' => ['_', '_'],
        # 'bi' => ['_**', '**_'],
        'list' => ["\n", ""],
        'list-item' => ['- ', "\n"],
        'id' => ['`', '`'],
        'file' => ['`', '`'],
        'path' => ['`', '`'],
        'cmd' => ['`', '`'],
        'param' => ['`', '`'],
        'setting' => ['`', '`'],
        'pg-setting' => ['`', '`'],
        'code' => ['`', '`'],
        # 'code-block' => ['```', '```'],
        # 'exe' => [undef, ''],
        'backrest' => [undef, ''],
        'proper' => ['', ''],
        'postgres' => ['PostgreSQL', ''],
        'admonition' => ["\n> **", "\n"],
    },

    'text' =>
    {
        'quote' => ['"', '"'],
        'p' => ['', "\n\n"],
        'b' => ['', ''],
        'i' => ['', ''],
        # 'bi' => ['', ''],
        'list' => ["", "\n"],
        'list-item' => ['* ', "\n"],
        'id' => ['', ''],
        'host' => ['', ''],
        'file' => ['', ''],
        'path' => ['', ''],
        'cmd' => ['', ''],
        'br-option' => ['', ''],
        'pg-setting' => ['', ''],
        'param' => ['', ''],
        'setting' => ['', ''],
        'code' => ['', ''],
        'code-block' => ['', ''],
        'exe' => [undef, ''],
        'backrest' => [undef, ''],
        'proper' => ['', ''],
        'postgres' => ['PostgreSQL', ''],
        'admonition' => ['', "\n\n"],
    },

    'latex' =>
    {
        'quote' => ['``', '"'],
        'p' => ["\n\\begin{sloppypar}", "\\end{sloppypar}\n"],
        'b' => ['\textbf{', '}'],
        'i' => ['\textit{', '}'],
        # 'bi' => ['', ''],
        'list' => ["\\begin{itemize}\n", "\\end{itemize}\n"],
        'list-item' => ['\item ', "\n"],
        'id' => ['\textnormal{\texttt{', '}}'],
        'host' => ['\textnormal{\textbf{', '}}'],
        'file' => ['\textnormal{\texttt{', '}}'],
        'path' => ['\textnormal{\texttt{', '}}'],
        'cmd' => ['\textnormal{\texttt{', "}}"],
        'user' => ['\textnormal{\texttt{', '}}'],
        'br-option' => ['', ''],
        # 'param' => ['\texttt{', '}'],
        # 'setting' => ['\texttt{', '}'],
        'br-option' =>
            ['\textnormal{\texttt{', '}}'],
        'br-setting' => ['\textnormal{\texttt{', '}}'],
        'pg-option' => ['\textnormal{\texttt{', '}}'],
        'pg-setting' => ['\textnormal{\texttt{', '}}'],
        'code' => ['\textnormal{\texttt{', '}}'],
        # 'code' => ['\texttt{', '}'],
        # 'code-block' => ['', ''],
        # 'exe' => [undef, ''],
        'backrest' => [undef, ''],
        'proper' => ['\textnormal{\texttt{', '}}'],
        'postgres' => ['PostgreSQL', ''],
        'admonition' =>
            ["\n\\vspace{.5em}\\begin{leftbar}\n\\begin{sloppypar}\\textit{\\textbf{", "}\\end{sloppypar}\n\\end{leftbar}\n"],
    },

    'html' =>
    {
        'quote' => ['', ''],
        'b' => ['', ''],
        'i' => ['', ''],
        'p' => ['', ''],
        # 'bi' => ['', ''],
        'list' => ['', ''],
        'list-item' => ['', ''],
        'id' => ['', ''],
        'host' => ['', ''],
        'file' => ['', ''],
        'path' => ['', ''],
        'cmd' => ['', ''],
        'user' => ['', ''],
        'br-option' => ['', ''],
        'br-setting' => ['', ''],
        'pg-option' => ['', ''],
        'pg-setting' => ['', ''],
        'code' => ['', ''],
        'code-block' => ['', ''],
        'exe' => [undef, ''],
        'setting' => ['', ''], # ??? This will need to be fixed
        'backrest' => [undef, ''],
        'proper' => ['', ''],
        'postgres' => ['PostgreSQL', ''],
        'admonition' => ['', ''],
    }
};

####################################################################################################################################
# CONSTRUCTOR
####################################################################################################################################
sub new
{
    my $class = shift; # Class name

    # Create the class hash
    my $self = {};
    bless $self, $class;

    # Assign function parameters, defaults, and log debug info
    (
        my $strOperation,
        $self->{strType},
        $self->{oManifest},
        $self->{bExe},
        $self->{strRenderOutKey},
    ) =
        logDebugParam
        (
            __PACKAGE__ . '->new', \@_,
            {name => 'strType'},
            {name => 'oManifest', required => false},
            {name => 'bExe', required => false},
            {name => 'strRenderOutKey', required => false}
        );

    # Create JSON object
    $self->{oJSON} = JSON::PP->new()->allow_nonref();

    # Initialize project tags
    $$oRenderTag{markdown}{backrest}[0] = "{[project]}";
    $$oRenderTag{markdown}{exe}[0] = "{[project-exe]}";

    $$oRenderTag{text}{backrest}[0] = "{[project]}";
    $$oRenderTag{text}{exe}[0] = "{[project-exe]}";

    $$oRenderTag{latex}{backrest}[0] = "{[project]}";
    $$oRenderTag{latex}{exe}[0] = "\\textnormal\{\\texttt\{{[project-exe]}\}\}";

    $$oRenderTag{html}{backrest}[0] = "{[project]}";
    $$oRenderTag{html}{exe}[0] = "{[project-exe]}";

    if (defined($self->{strRenderOutKey}))
    {
        # Copy page data to self
        my $oRenderOut = $self->{oManifest}->renderOutGet($self->{strType} eq 'latex' ?
'pdf' : $self->{strType}, $self->{strRenderOutKey}); # If these are the backrest docs then load the help if ($self->{oManifest}->isBackRest()) { $self->{oReference} = new pgBackRestDoc::Common::DocConfig(${$self->{oManifest}->sourceGet('help')}{doc}, $self); } if (defined($$oRenderOut{source}) && $$oRenderOut{source} eq 'help' && $self->{oManifest}->isBackRest()) { if ($self->{strRenderOutKey} eq 'configuration') { $self->{oDoc} = $self->{oReference}->helpConfigDocGet(); } elsif ($self->{strRenderOutKey} eq 'command') { $self->{oDoc} = $self->{oReference}->helpCommandDocGet(); } else { confess &log(ERROR, "cannot render $self->{strRenderOutKey} from source $$oRenderOut{source}"); } } elsif (defined($$oRenderOut{source}) && $$oRenderOut{source} eq 'release' && $self->{oManifest}->isBackRest()) { require pgBackRestDoc::Custom::DocCustomRelease; pgBackRestDoc::Custom::DocCustomRelease->import(); $self->{oDoc} = (new pgBackRestDoc::Custom::DocCustomRelease( ${$self->{oManifest}->sourceGet('release')}{doc}, defined($self->{oManifest}->variableGet('dev')) && $self->{oManifest}->variableGet('dev') eq 'y'))->docGet(); } else { $self->{oDoc} = ${$self->{oManifest}->sourceGet($self->{strRenderOutKey})}{doc}; } $self->{oSource} = $self->{oManifest}->sourceGet($$oRenderOut{source}); } if (defined($self->{strRenderOutKey})) { # Build the doc $self->build($self->{oDoc}); # Get required sections foreach my $strPath (@{$self->{oManifest}->{stryRequire}}) { if (substr($strPath, 0, 1) ne '/') { confess &log(ERROR, "path ${strPath} must begin with a /"); } if (!defined($self->{oSection}->{$strPath})) { confess &log(ERROR, "required section '${strPath}' does not exist"); } if (defined(${$self->{oSection}}{$strPath})) { $self->required($strPath); } } } if (defined($self->{oDoc})) { $self->{bToc} = !defined($self->{oDoc}->paramGet('toc', false)) || $self->{oDoc}->paramGet('toc') eq 'y' ? 
true : false; $self->{bTocNumber} = $self->{bToc} && (!defined($self->{oDoc}->paramGet('toc-number', false)) || $self->{oDoc}->paramGet('toc-number') eq 'y') ? true : false; } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'self', value => $self} ); } #################################################################################################################################### # Set begin and end values for a tag #################################################################################################################################### sub tagSet { my $self = shift; my $strTag = shift; my $strBegin = shift; my $strEnd = shift; $oRenderTag->{$self->{strType}}{$strTag}[0] = defined($strBegin) ? $strBegin : ''; $oRenderTag->{$self->{strType}}{$strTag}[1] = defined($strEnd) ? $strEnd : ''; } #################################################################################################################################### # variableReplace # # Replace variables in the string. #################################################################################################################################### sub variableReplace { my $self = shift; return defined($self->{oManifest}) ? $self->{oManifest}->variableReplace(shift, $self->{strType}) : shift; } #################################################################################################################################### # variableSet # # Set a variable to be replaced later. #################################################################################################################################### sub variableSet { my $self = shift; return $self->{oManifest}->variableSet(shift, shift); } #################################################################################################################################### # variableGet # # Get the current value of a variable. 
#################################################################################################################################### sub variableGet { my $self = shift; return $self->{oManifest}->variableGet(shift); } #################################################################################################################################### # Get pre-execute list for a host #################################################################################################################################### sub preExecute { my $self = shift; my $strHost = shift; if (defined($self->{preExecute}{$strHost})) { return @{$self->{preExecute}{$strHost}}; } return; } #################################################################################################################################### # build # # Build the section map and perform filtering. #################################################################################################################################### sub build { my $self = shift; my $oNode = shift; my $oParent = shift; my $strPath = shift; my $strPathPrefix = shift; # &log(INFO, " node " . $oNode->nameGet()); my $strName = $oNode->nameGet(); if (defined($oParent)) { # Evaluate if condition -- when false the node will be removed if (!$self->{oManifest}->evaluateIf($oNode)) { my $strDescription; if (defined($oNode->nodeGet('title', false))) { $strDescription = $self->processText($oNode->nodeGet('title')->textGet()); } &log(DEBUG, " filtered ${strName}" . (defined($strDescription) ? ": ${strDescription}" : '')); $oParent->nodeRemove($oNode); return; } } else { &log(DEBUG, ' build document'); $self->{oSection} = {}; } # Build section if ($strName eq 'section') { my $strSectionId = $oNode->paramGet('id'); &log(DEBUG, "build section [${strSectionId}]"); # Set path and parent-path for this section if (defined($strPath)) { $oNode->paramSet('path-parent', $strPath); } $strPath .= '/' . 
$oNode->paramGet('id'); &log(DEBUG, " path ${strPath}"); ${$self->{oSection}}{$strPath} = $oNode; $oNode->paramSet('path', $strPath); # If depend is not set then set it to the last section my $strDepend = $oNode->paramGet('depend', false); my $oContainerNode = defined($oParent) ? $oParent : $self->{oDoc}; my $oLastChild; my $strDependPrev; foreach my $oChild ($oContainerNode->nodeList('section', false)) { if ($oChild->paramGet('id') eq $oNode->paramGet('id')) { if (defined($oLastChild)) { $strDependPrev = $oLastChild->paramGet('id'); } elsif (defined($oParent->paramGet('depend', false))) { $strDependPrev = $oParent->paramGet('depend'); } last; } $oLastChild = $oChild; } if (defined($strDepend)) { if (defined($strDependPrev) && $strDepend eq $strDependPrev && !$oNode->paramTest('depend-default')) { &log(WARN, "section '${strPath}' depend is set to '${strDepend}' which is the default, best to remove" . " because it may become obsolete if a new section is added in between"); } } else { $strDepend = $strDependPrev; } # If depend is defined make sure it exists if (defined($strDepend)) { # If this is a relative depend then prepend the parent section if (index($strDepend, '/') != 0) { if (defined($oParent->paramGet('path', false))) { $strDepend = $oParent->paramGet('path') . '/' . $strDepend; } else { $strDepend = "/${strDepend}"; } } if (!defined($self->{oSection}->{$strDepend})) { confess &log(ERROR, "section '${strSectionId}' depend '${strDepend}' is not valid"); } } if (defined($strDepend)) { $oNode->paramSet('depend', $strDepend); } if (defined($strDependPrev)) { $oNode->paramSet('depend-default', $strDependPrev); } # Set log to true if this section has an execute list. This helps reduce the info logging by only showing sections that are # likely to take a log time. $oNode->paramSet('log', $self->{bExe} && $oNode->nodeList('execute-list', false) > 0 ? 
true : false); # If section content is being pulled from elsewhere go get the content if ($oNode->paramTest('source')) { my $oSource = ${$self->{oManifest}->sourceGet($oNode->paramGet('source'))}{doc}; # Section should not already have title defined, it should come from the source doc if ($oNode->nodeTest('title')) { confess &log(ERROR, "cannot specify title in section that sources another document"); } # Set title from source doc's title $oNode->nodeAdd('title')->textSet($oSource->paramGet('title')); foreach my $oSection ($oSource->nodeList('section')) { push(@{${$oNode->{oDoc}}{children}}, $oSection->{oDoc}); } # Set path prefix to modify all section paths further down $strPathPrefix = $strPath; # Remove source so it is not included again later $oNode->paramSet('source', undef); } } # Build link elsif ($strName eq 'link') { &log(DEBUG, 'build link [' . $oNode->valueGet() . ']'); # If the path prefix is set and this is a section if (defined($strPathPrefix) && $oNode->paramTest('section')) { my $strNewPath = $strPathPrefix . $oNode->paramGet('section'); &log(DEBUG, "modify link section from '" . $oNode->paramGet('section') . 
"' to '${strNewPath}'"); $oNode->paramSet('section', $strNewPath); } } # Store block defines elsif ($strName eq 'block-define') { my $strBlockId = $oNode->paramGet('id'); if (defined($self->{oyBlockDefine}{$strBlockId})) { confess &log(ERROR, "block ${strBlockId} is already defined"); } $self->{oyBlockDefine}{$strBlockId} = dclone($oNode->{oDoc}{children}); $oParent->nodeRemove($oNode); } # Copy blocks elsif ($strName eq 'block') { my $strBlockId = $oNode->paramGet('id'); if (!defined($self->{oyBlockDefine}{$strBlockId})) { confess &log(ERROR, "block ${strBlockId} is not defined"); } my $strNodeJSON = $self->{oJSON}->encode($self->{oyBlockDefine}{$strBlockId}); foreach my $oVariable ($oNode->nodeList('block-variable-replace', false)) { my $strVariableKey = $oVariable->paramGet('key'); my $strVariableReplace = $oVariable->valueGet(); $strNodeJSON =~ s/\{\[$strVariableKey\]\}/$strVariableReplace/g; } my ($iReplaceIdx, $iReplaceTotal) = $oParent->nodeReplace($oNode, $self->{oJSON}->decode($strNodeJSON)); # Build any new children that were added my $iChildIdx = 0; foreach my $oChild ($oParent->nodeList(undef, false)) { if ($iChildIdx >= $iReplaceIdx && $iChildIdx < ($iReplaceIdx + $iReplaceTotal)) { $self->build($oChild, $oParent, $strPath, $strPathPrefix); } $iChildIdx++; } } # Check for pre-execute statements elsif ($strName eq 'execute') { if ($self->{oManifest}->{bPre} && $oNode->paramGet('pre', false, 'n') eq 'y') { # Add to pre-execute list my $strHost = $self->variableReplace($oParent->paramGet('host')); push(@{$self->{preExecute}{$strHost}}, $oNode); # Skip this command so it doesn't get executed twice $oNode->paramSet('skip', 'y') } } # Iterate all text nodes if (defined($oNode->textGet(false))) { foreach my $oChild ($oNode->textGet()->nodeList(undef, false)) { if (ref(\$oChild) ne "SCALAR") { $self->build($oChild, $oNode, $strPath, $strPathPrefix); } } } # Iterate all non-text nodes foreach my $oChild ($oNode->nodeList(undef, false)) { if (ref(\$oChild) ne 
"SCALAR") { $self->build($oChild, $oNode, $strPath, $strPathPrefix); # If the child should be logged then log the parent as well so the hierarchy is complete if ($oChild->nameGet() eq 'section' && $oChild->paramGet('log', false, false)) { $oNode->paramSet('log', true); } } } } #################################################################################################################################### # required # # Build a list of required sections #################################################################################################################################### sub required { my $self = shift; my $strPath = shift; my $bDepend = shift; # If node is not found that means the path is invalid my $oNode = ${$self->{oSection}}{$strPath}; if (!defined($oNode)) { confess &log(ERROR, "invalid path ${strPath}"); } # Only add sections that are listed dependencies if (!defined($bDepend) || $bDepend) { # Match section and all child sections foreach my $strChildPath (sort(keys(%{$self->{oSection}}))) { if ($strChildPath =~ /^$strPath$/ || $strChildPath =~ /^$strPath\/.*$/) { if (!defined(${$self->{oSectionRequired}}{$strChildPath})) { my @stryChildPath = split('/', $strChildPath); &log(INFO, (' ' x (scalar(@stryChildPath) - 2)) . 
" require section: ${strChildPath}"); ${$self->{oSectionRequired}}{$strChildPath} = true; } } } } # Get the path of the current section's parent my $strParentPath = $oNode->paramGet('path-parent', false); if ($oNode->paramTest('depend')) { foreach my $strDepend (split(',', $oNode->paramGet('depend'))) { if ($strDepend !~ /^\//) { if (!defined($strParentPath)) { $strDepend = "/${strDepend}"; } else { $strDepend = "${strParentPath}/${strDepend}"; } } $self->required($strDepend, true); } } elsif (defined($strParentPath)) { $self->required($strParentPath, false); } } #################################################################################################################################### # isRequired # # Is it required to execute the section statements? #################################################################################################################################### sub isRequired { my $self = shift; my $oSection = shift; if (!defined($self->{oSectionRequired})) { return true; } my $strPath = $oSection->paramGet('path'); defined(${$self->{oSectionRequired}}{$strPath}) ? true : false; } #################################################################################################################################### # processTag #################################################################################################################################### sub processTag { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oTag ) = logDebugParam ( __PACKAGE__ . 
'->processTag', \@_, {name => 'oTag', trace => true} ); my $strBuffer = ""; my $strType = $self->{strType}; my $strTag = $oTag->nameGet(); if (!defined($strTag)) { require Data::Dumper; confess Dumper($oTag); } if ($strTag eq 'link') { my $strUrl = $oTag->paramGet('url', false); if (!defined($strUrl)) { my $strPage = $self->variableReplace($oTag->paramGet('page', false)); my $strSection = $oTag->paramGet('section', false); # If a page/section link points to the current page then remove the page portion if (defined($strPage) && defined($strSection) && defined($self->{strRenderOutKey}) && $strPage eq $self->{strRenderOutKey}) { undef($strPage); } # If this is a page URL if (defined($strPage)) { # If the page wasn't rendered then point at the website if (!defined($self->{oManifest}->renderOutGet($strType, $strPage, true))) { $strUrl = '{[backrest-url-base]}/' . $oTag->paramGet('page') . '.html'; } # Else point locally else { if ($strType eq 'html') { $strUrl = "${strPage}.html". (defined($strSection) ? '#' . substr($strSection, 1) : ''); } elsif ($strType eq 'markdown') { if (defined($strSection)) { confess &log( ERROR, "page and section links not supported for type ${strType}, value '" . $oTag->valueGet() . "'"); } $strUrl = "${strPage}.md"; } else { confess &log(ERROR, "page links not supported for type ${strType}, value '" . $oTag->valueGet() . "'"); } } } else { my $strSection = $oTag->paramGet('section'); my $oSection = ${$self->{oSection}}{$strSection}; if (!defined($oSection)) { confess &log(ERROR, "section link '${strSection}' does not exist"); } if (!defined($strSection)) { confess &log(ERROR, "link with value '" . $oTag->valueGet() . "' must defined url, page, or section"); } if ($strType eq 'html') { $strUrl = '#' . substr($strSection, 1); } elsif ($strType eq 'latex') { $strUrl = $strSection; } else { $strUrl = lc($self->processText($oSection->nodeGet('title')->textGet())); $strUrl =~ s/[^\w\- ]//g; $strUrl =~ s/ /-/g; $strUrl = '#' . 
$strUrl; } } } if ($strType eq 'html') { $strBuffer = '' . $oTag->valueGet() . ''; } elsif ($strType eq 'markdown') { $strBuffer = '[' . $oTag->valueGet() . '](' . $strUrl . ')'; } elsif ($strType eq 'latex') { if ($oTag->paramTest('url')) { $strBuffer = "\\href{$strUrl}{" . $oTag->valueGet() . "}"; } else { $strBuffer = "\\hyperref[$strUrl]{" . $oTag->valueGet() . "}"; } } elsif ($strType eq 'text') { $strBuffer = $oTag->valueGet(); } else { confess "'link' tag not valid for type ${strType}"; } } else { my $strStart = $$oRenderTag{$strType}{$strTag}[0]; my $strStop = $$oRenderTag{$strType}{$strTag}[1]; if (!defined($strStart) || !defined($strStop)) { confess &log(ERROR, "invalid type ${strType} or tag ${strTag}"); } $strBuffer .= $strStart; # Admonitions in the help materials are tags of the text element rather than field elements of the document so special # handling is required if ($strTag eq 'admonition') { $strBuffer .= $self->processAdmonitionStart($oTag); } if ($strTag eq 'p' || $strTag eq 'title' || $strTag eq 'list-item' || $strTag eq 'code-block' || $strTag eq 'summary' || $strTag eq 'admonition') { $strBuffer .= $self->processText($oTag); } elsif (defined($oTag->valueGet())) { $strBuffer .= $oTag->valueGet(); } else { foreach my $oSubTag ($oTag->nodeList(undef, false)) { $strBuffer .= $self->processTag($oSubTag); } } if ($strTag eq 'admonition') { $strBuffer .= $self->processAdmonitionEnd($oTag); } $strBuffer .= $strStop; } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strBuffer', value => $strBuffer, trace => true} ); } #################################################################################################################################### # processAdmonitionStart #################################################################################################################################### sub processAdmonitionStart { my $self = shift; # Assign function parameters, defaults, and log 
debug info my ( $strOperation, $oTag ) = logDebugParam ( __PACKAGE__ . '->processAdmonitionStart', \@_, {name => 'oTag', trace => true} ); my $strType = $self->{strType}; my $strBuffer = ''; # Note that any changes to the way the HTML, markdown or latex display tags may also need to be made here if ($strType eq 'html') { my $strType = $oTag->paramGet('type'); $strBuffer = '
    ' . uc($strType) . ':
    ' . '
    '; } elsif ($strType eq 'text' || $strType eq 'markdown') { $strBuffer = uc($oTag->paramGet('type')) . ": "; } elsif ($strType eq 'latex') { $strBuffer = uc($oTag->paramGet('type')) . ": }"; } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strBuffer', value => $strBuffer, trace => true} ); } #################################################################################################################################### # processAdmonitionEnd #################################################################################################################################### sub processAdmonitionEnd { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oTag ) = logDebugParam ( __PACKAGE__ . '->processAdmonitionEnd', \@_, {name => 'oTag', trace => true} ); my $strType = $self->{strType}; my $strBuffer = ''; # Note that any changes to the way the HTML, markdown or latex display tags may also need to be made here if ($strType eq 'html') { $strBuffer = '
    '; } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strBuffer', value => $strBuffer, trace => true} ); } #################################################################################################################################### # processText #################################################################################################################################### sub processText { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oText ) = logDebugParam ( __PACKAGE__ . '->processText', \@_, {name => 'oText', trace => true} ); my $strType = $self->{strType}; my $strBuffer = ''; my $strLastTag = 'body'; foreach my $oNode ($oText->nodeList(undef, false)) { if (ref(\$oNode) eq "SCALAR") { if ($oNode =~ /\"/) { confess &log(ERROR, "unable to process quotes in string (use instead):\n${oNode}"); } # Skip text nodes with linefeeds since they happen between tags if (index($oNode, "\n") == -1) { $strBuffer .= $oNode; } } else { # Add br tags to separate paragraphs and linefeeds to make the output more diffable. This is needed because of the hacky # way config text is being rendered in the final document, i.e. by passing rendered HTML into divs rather than XML to be # rendered at that time. if ($strLastTag eq 'p' && $strType eq 'html') { $strBuffer .= "
    \n"; if ($oNode->nameGet() eq 'p') { $strBuffer .= "
    \n"; } } $strBuffer .= $self->processTag($oNode); $strLastTag = $oNode->nameGet(); } } # # if ($strType eq 'html') # { # # $strBuffer =~ s/^\s+|\s+$//g; # # $strBuffer =~ s/\n/\\n/g; # } # if ($strType eq 'markdown') # { # $strBuffer =~ s/^\s+|\s+$//g; $strBuffer =~ s/ +/ /g; $strBuffer =~ s/^ //smg; # } if ($strType eq 'latex') { $strBuffer =~ s/\&mdash\;/---/g; $strBuffer =~ s/\<\;/\\=/\$\\geq\$/g; # $strBuffer =~ s/\_/\\_/g; # If not a code-block, which is to be taken AS IS, then escape special characters in latex if ($oText->nameGet() ne 'code-block') { # If the previous character is not already a slash (e.g. not already escaped) then insert a slash $strBuffer =~ s/(?nameGet() eq 'list-item') { $strBuffer =~ s/\[/\{\[/g; $strBuffer =~ s/\]/\]\}/g; } $strBuffer =~ s/\©\;/{\\textcopyright}/g; $strBuffer =~ s/\&trade\;/{\\texttrademark}/g; $strBuffer =~ s/\®\;/{\\textregistered}/g; $strBuffer =~ s/\&rarr\;/{\\textrightarrow}/g; # Escape all ampersands after making any other conversions above $strBuffer =~ s/(?\=/g; } $strBuffer = $self->variableReplace($strBuffer); # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strBuffer', value => $strBuffer, trace => true} ); } 1; pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Common/Exception.pm000066400000000000000000000253701416457663300247500ustar00rootroot00000000000000#################################################################################################################################### # COMMON EXCEPTION MODULE #################################################################################################################################### package pgBackRestDoc::Common::Exception; use strict; use warnings FATAL => qw(all); use Carp qw(confess longmess); use Scalar::Util qw(blessed); use Exporter qw(import); our @EXPORT = qw(); 
####################################################################################################################################
# Error Definitions
####################################################################################################################################
use constant ERROR_MINIMUM => 25;
push @EXPORT, qw(ERROR_MINIMUM);
use constant ERROR_MAXIMUM => 125;
push @EXPORT, qw(ERROR_MAXIMUM);

use constant ERROR_ASSERT => 25;
push @EXPORT, qw(ERROR_ASSERT);
use constant ERROR_CHECKSUM => 26;
push @EXPORT, qw(ERROR_CHECKSUM);
use constant ERROR_CONFIG => 27;
push @EXPORT, qw(ERROR_CONFIG);
use constant ERROR_FILE_INVALID => 28;
push @EXPORT, qw(ERROR_FILE_INVALID);
use constant ERROR_FORMAT => 29;
push @EXPORT, qw(ERROR_FORMAT);
use constant ERROR_OPTION_INVALID_VALUE => 32;
push @EXPORT, qw(ERROR_OPTION_INVALID_VALUE);
use constant ERROR_PG_RUNNING => 38;
push @EXPORT, qw(ERROR_PG_RUNNING);
use constant ERROR_PATH_NOT_EMPTY => 40;
push @EXPORT, qw(ERROR_PATH_NOT_EMPTY);
use constant ERROR_FILE_OPEN => 41;
push @EXPORT, qw(ERROR_FILE_OPEN);
use constant ERROR_FILE_READ => 42;
push @EXPORT, qw(ERROR_FILE_READ);
use constant ERROR_ARCHIVE_MISMATCH => 44;
push @EXPORT, qw(ERROR_ARCHIVE_MISMATCH);
use constant ERROR_ARCHIVE_DUPLICATE => 45;
push @EXPORT, qw(ERROR_ARCHIVE_DUPLICATE);
use constant ERROR_PATH_CREATE => 47;
push @EXPORT, qw(ERROR_PATH_CREATE);
use constant ERROR_LOCK_ACQUIRE => 50;
push @EXPORT, qw(ERROR_LOCK_ACQUIRE);
use constant ERROR_BACKUP_MISMATCH => 51;
push @EXPORT, qw(ERROR_BACKUP_MISMATCH);
use constant ERROR_PATH_OPEN => 53;
push @EXPORT, qw(ERROR_PATH_OPEN);
use constant ERROR_PATH_SYNC => 54;
push @EXPORT, qw(ERROR_PATH_SYNC);
use constant ERROR_FILE_MISSING => 55;
push @EXPORT, qw(ERROR_FILE_MISSING);
use constant ERROR_DB_CONNECT => 56;
push @EXPORT, qw(ERROR_DB_CONNECT);
use constant ERROR_DB_QUERY => 57;
push @EXPORT, qw(ERROR_DB_QUERY);
use constant ERROR_DB_MISMATCH => 58;
push @EXPORT, qw(ERROR_DB_MISMATCH);
use constant ERROR_PATH_REMOVE => 61;
push @EXPORT, qw(ERROR_PATH_REMOVE);
use constant ERROR_STOP => 62;
push @EXPORT, qw(ERROR_STOP);
use constant ERROR_FILE_WRITE => 64;
push @EXPORT, qw(ERROR_FILE_WRITE);
use constant ERROR_FEATURE_NOT_SUPPORTED => 67;
push @EXPORT, qw(ERROR_FEATURE_NOT_SUPPORTED);
use constant ERROR_ARCHIVE_COMMAND_INVALID => 68;
push @EXPORT, qw(ERROR_ARCHIVE_COMMAND_INVALID);
use constant ERROR_LINK_EXPECTED => 69;
push @EXPORT, qw(ERROR_LINK_EXPECTED);
use constant ERROR_LINK_DESTINATION => 70;
push @EXPORT, qw(ERROR_LINK_DESTINATION);
use constant ERROR_PATH_MISSING => 73;
push @EXPORT, qw(ERROR_PATH_MISSING);
use constant ERROR_FILE_MOVE => 74;
push @EXPORT, qw(ERROR_FILE_MOVE);
use constant ERROR_PATH_TYPE => 77;
push @EXPORT, qw(ERROR_PATH_TYPE);
use constant ERROR_DB_MISSING => 80;
push @EXPORT, qw(ERROR_DB_MISSING);
use constant ERROR_DB_INVALID => 81;
push @EXPORT, qw(ERROR_DB_INVALID);
use constant ERROR_ARCHIVE_TIMEOUT => 82;
push @EXPORT, qw(ERROR_ARCHIVE_TIMEOUT);
use constant ERROR_ARCHIVE_DISABLED => 87;
push @EXPORT, qw(ERROR_ARCHIVE_DISABLED);
use constant ERROR_FILE_OWNER => 88;
push @EXPORT, qw(ERROR_FILE_OWNER);
use constant ERROR_PATH_EXISTS => 92;
push @EXPORT, qw(ERROR_PATH_EXISTS);
use constant ERROR_FILE_EXISTS => 93;
push @EXPORT, qw(ERROR_FILE_EXISTS);
use constant ERROR_CRYPTO => 95;
push @EXPORT, qw(ERROR_CRYPTO);

use constant ERROR_INVALID => 123;
push @EXPORT, qw(ERROR_INVALID);
use constant ERROR_UNHANDLED => 124;
push @EXPORT, qw(ERROR_UNHANDLED);
use constant ERROR_UNKNOWN => 125;
push @EXPORT, qw(ERROR_UNKNOWN);

####################################################################################################################################
# CONSTRUCTOR
####################################################################################################################################
sub new
{
    my $class = shift;          # Class name
    my $strLevel = shift;       # Log level
    my $iCode = shift;          # Error code
    my $strMessage =
shift;     # Error message
    my $strTrace = shift;       # Stack trace
    my $rExtra = shift;         # Extra info used exclusively by the logging system
    my $bErrorC = shift;        # Is this a C error?

    if ($iCode < ERROR_MINIMUM || $iCode > ERROR_MAXIMUM)
    {
        $iCode = ERROR_INVALID;
    }

    # Create the class hash
    my $self = {};
    bless $self, $class;

    # Initialize exception
    $self->{strLevel} = $strLevel;
    $self->{iCode} = $iCode;
    $self->{strMessage} = $strMessage;
    $self->{strTrace} = $strTrace;
    $self->{rExtra} = $rExtra;
    $self->{bErrorC} = $bErrorC ? 1 : 0;

    return $self;
}

####################################################################################################################################
# level
####################################################################################################################################
sub level
{
    my $self = shift;

    return $self->{strLevel};
}

####################################################################################################################################
# CODE
####################################################################################################################################
sub code
{
    my $self = shift;

    return $self->{iCode};
}

####################################################################################################################################
# extra
####################################################################################################################################
sub extra
{
    my $self = shift;

    return $self->{rExtra};
}

####################################################################################################################################
# MESSAGE
####################################################################################################################################
sub message
{
    my $self = shift;

    return $self->{strMessage};
}

####################################################################################################################################
# TRACE
####################################################################################################################################
sub trace
{
    my $self = shift;

    return $self->{strTrace};
}

####################################################################################################################################
# isException - is this a structured exception or a default Perl exception?
####################################################################################################################################
sub isException
{
    my $roException = shift;

    # Only check if defined
    if (defined($roException) && defined($$roException))
    {
        # If a standard Exception
        if (blessed($$roException))
        {
            return $$roException->isa('pgBackRestDoc::Common::Exception') ? 1 : 0;
        }
        # Else if a specially formatted string from the C library
        elsif ($$roException =~ /^PGBRCLIB\:[0-9]+\:/)
        {
            # Split message and discard the first part used for identification
            my @stryException = split(/\:/, $$roException);
            shift(@stryException);

            # Construct exception fields
            my $iCode = shift(@stryException) + 0;
            my $strTrace = shift(@stryException) . qw{:} . shift(@stryException);
            my $strMessage = join(':', @stryException);

            # Create exception
            $$roException = new pgBackRestDoc::Common::Exception("ERROR", $iCode, $strMessage, $strTrace, undef, 1);

            return 1;
        }
    }

    return 0;
}

push @EXPORT, qw(isException);

####################################################################################################################################
# exceptionCode
#
# Extract the error code from an exception - if a Perl exception return ERROR_UNKNOWN.
####################################################################################################################################
sub exceptionCode
{
    my $oException = shift;

    return isException(\$oException) ? $oException->code() : ERROR_UNKNOWN;
}

push @EXPORT, qw(exceptionCode);

####################################################################################################################################
# exceptionMessage
#
# Extract the error message from an exception - if a Perl exception return bare exception.
####################################################################################################################################
sub exceptionMessage
{
    my $oException = shift;

    return isException(\$oException) ? $oException->message() : $oException;
}

push @EXPORT, qw(exceptionMessage);

1;
pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Common/Ini.pm
####################################################################################################################################
# COMMON INI MODULE
####################################################################################################################################
package pgBackRestDoc::Common::Ini;

use strict;
use warnings FATAL => qw(all);
use Carp qw(confess);
use English '-no_match_vars';

use Digest::SHA qw(sha1_hex);
use Exporter qw(import);
    our @EXPORT = qw();
use File::Basename qw(dirname);
use JSON::PP;
use Storable qw(dclone);

use pgBackRestDoc::Common::Exception;
use pgBackRestDoc::Common::Log;
use pgBackRestDoc::Common::String;
use pgBackRestDoc::ProjectInfo;

####################################################################################################################################
# Boolean constants
####################################################################################################################################
use constant INI_TRUE => JSON::PP::true;
push @EXPORT, qw(INI_TRUE);
use constant INI_FALSE => JSON::PP::false;
push @EXPORT, qw(INI_FALSE);
#################################################################################################################################### # Ini control constants #################################################################################################################################### use constant INI_SECTION_BACKREST => 'backrest'; push @EXPORT, qw(INI_SECTION_BACKREST); use constant INI_KEY_CHECKSUM => 'backrest-checksum'; push @EXPORT, qw(INI_KEY_CHECKSUM); use constant INI_KEY_FORMAT => 'backrest-format'; push @EXPORT, qw(INI_KEY_FORMAT); use constant INI_KEY_VERSION => 'backrest-version'; push @EXPORT, qw(INI_KEY_VERSION); use constant INI_SECTION_CIPHER => 'cipher'; push @EXPORT, qw(INI_SECTION_CIPHER); use constant INI_KEY_CIPHER_PASS => 'cipher-pass'; push @EXPORT, qw(INI_KEY_CIPHER_PASS); #################################################################################################################################### # Ini file copy extension #################################################################################################################################### use constant INI_COPY_EXT => '.copy'; push @EXPORT, qw(INI_COPY_EXT); #################################################################################################################################### # Ini sort orders #################################################################################################################################### use constant INI_SORT_FORWARD => 'forward'; push @EXPORT, qw(INI_SORT_FORWARD); use constant INI_SORT_REVERSE => 'reverse'; push @EXPORT, qw(INI_SORT_REVERSE); use constant INI_SORT_NONE => 'none'; push @EXPORT, qw(INI_SORT_NONE); #################################################################################################################################### # new() #################################################################################################################################### sub new { my $class = shift; # 
Class name # Create the class hash my $self = {}; bless $self, $class; # Assign function parameters, defaults, and log debug info ( my $strOperation, $self->{oStorage}, $self->{strFileName}, my $bLoad, my $strContent, $self->{iInitFormat}, $self->{strInitVersion}, my $bIgnoreMissing, $self->{strCipherPass}, # Passphrase to read/write the file my $strCipherPassSub, # Passphrase to read/write subsequent files ) = logDebugParam ( __PACKAGE__ . '->new', \@_, {name => 'oStorage', trace => true}, {name => 'strFileName', trace => true}, {name => 'bLoad', optional => true, default => true, trace => true}, {name => 'strContent', optional => true, trace => true}, {name => 'iInitFormat', optional => true, default => REPOSITORY_FORMAT, trace => true}, {name => 'strInitVersion', optional => true, default => PROJECT_VERSION, trace => true}, {name => 'bIgnoreMissing', optional => true, default => false, trace => true}, {name => 'strCipherPass', optional => true, trace => true}, {name => 'strCipherPassSub', optional => true, trace => true}, ); # Set changed to false $self->{bModified} = false; # Set exists to false $self->{bExists} = false; # Load the file if requested if ($bLoad) { $self->load($bIgnoreMissing); } # Load from a string if provided elsif (defined($strContent)) { $self->{oContent} = iniParse($strContent); $self->headerCheck(); } # Initialize if not loading the file and not loading from string or if a load was attempted and the file does not exist if (!$self->{bExists} && !defined($strContent)) { $self->numericSet(INI_SECTION_BACKREST, INI_KEY_FORMAT, undef, $self->{iInitFormat}); $self->set(INI_SECTION_BACKREST, INI_KEY_VERSION, undef, $self->{strInitVersion}); # Determine if the passphrase section should be set if (defined($self->{strCipherPass}) && defined($strCipherPassSub)) { $self->set(INI_SECTION_CIPHER, INI_KEY_CIPHER_PASS, undef, $strCipherPassSub); } } return $self; } 
#################################################################################################################################### # loadVersion() - load a version (main or copy) of the ini file #################################################################################################################################### sub loadVersion { my $self = shift; my $bCopy = shift; my $bIgnoreError = shift; # Load main my $rstrContent = $self->{oStorage}->get( $self->{oStorage}->openRead($self->{strFileName} . ($bCopy ? INI_COPY_EXT : ''), {bIgnoreMissing => $bIgnoreError, strCipherPass => $self->{strCipherPass}})); # If the file exists then attempt to parse it if (defined($rstrContent)) { my $rhContent = iniParse($$rstrContent, {bIgnoreInvalid => $bIgnoreError}); # If the content is valid then check the header if (defined($rhContent)) { $self->{oContent} = $rhContent; # If the header is invalid then undef content if (!$self->headerCheck({bIgnoreInvalid => $bIgnoreError})) { delete($self->{oContent}); } } } return defined($self->{oContent}); } #################################################################################################################################### # load() - load the ini #################################################################################################################################### sub load { my $self = shift; my $bIgnoreMissing = shift; # If main was not loaded then try the copy if (!$self->loadVersion(false, true)) { if (!$self->loadVersion(true, true)) { return if $bIgnoreMissing; confess &log(ERROR, "unable to open $self->{strFileName} or $self->{strFileName}" . 
INI_COPY_EXT, ERROR_FILE_MISSING); } } $self->{bExists} = true; } #################################################################################################################################### # headerCheck() - check that version and checksum in header are as expected #################################################################################################################################### sub headerCheck { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $bIgnoreInvalid, ) = logDebugParam ( __PACKAGE__ . '->headerCheck', \@_, {name => 'bIgnoreInvalid', optional => true, default => false, trace => true}, ); # Eval so exceptions can be ignored on bIgnoreInvalid my $bValid = true; eval { # Make sure the ini is valid by testing checksum my $strChecksum = $self->get(INI_SECTION_BACKREST, INI_KEY_CHECKSUM, undef, false); my $strTestChecksum = $self->hash(); if (!defined($strChecksum) || $strChecksum ne $strTestChecksum) { confess &log(ERROR, "invalid checksum in '$self->{strFileName}', expected '${strTestChecksum}' but found " . (defined($strChecksum) ? 
"'${strChecksum}'" : '[undef]'), ERROR_CHECKSUM); } # Make sure that the format is current, otherwise error my $iFormat = $self->get(INI_SECTION_BACKREST, INI_KEY_FORMAT, undef, false, 0); if ($iFormat != $self->{iInitFormat}) { confess &log(ERROR, "invalid format in '$self->{strFileName}', expected $self->{iInitFormat} but found ${iFormat}", ERROR_FORMAT); } # Check if the version has changed if (!$self->test(INI_SECTION_BACKREST, INI_KEY_VERSION, undef, $self->{strInitVersion})) { $self->set(INI_SECTION_BACKREST, INI_KEY_VERSION, undef, $self->{strInitVersion}); } return true; } or do { # Confess the error if it should not be ignored if (!$bIgnoreInvalid) { confess $EVAL_ERROR; } # Return false when errors are ignored $bValid = false; }; # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'bValid', value => $bValid, trace => true} ); } #################################################################################################################################### # iniParse() - parse from standard INI format to a hash. #################################################################################################################################### push @EXPORT, qw(iniParse); sub iniParse { # Assign function parameters, defaults, and log debug info my ( $strOperation, $strContent, $bRelaxed, $bIgnoreInvalid, ) = logDebugParam ( __PACKAGE__ . '::iniParse', \@_, {name => 'strContent', required => false, trace => true}, {name => 'bRelaxed', optional => true, default => false, trace => true}, {name => 'bIgnoreInvalid', optional => true, default => false, trace => true}, ); # Ini content my $oContent = undef; my $strSection; # Create the JSON object my $oJSON = JSON::PP->new()->allow_nonref(); # Eval so exceptions can be ignored on bIgnoreInvalid eval { # Read the INI file foreach my $strLine (split("\n", defined($strContent) ? 
$strContent : '')) { $strLine = trim($strLine); # Skip lines that are blank or comments if ($strLine ne '' && $strLine !~ '^[ ]*#.*') { # Get the section if (index($strLine, '[') == 0) { $strSection = substr($strLine, 1, length($strLine) - 2); } else { if (!defined($strSection)) { confess &log(ERROR, "key/value pair '${strLine}' found outside of a section", ERROR_CONFIG); } # Get key and value my $iIndex = index($strLine, '='); if ($iIndex == -1) { confess &log(ERROR, "unable to find '=' in '${strLine}'", ERROR_CONFIG); } my $strKey = substr($strLine, 0, $iIndex); my $strValue = substr($strLine, $iIndex + 1); # If relaxed then read the value directly if ($bRelaxed) { if (defined($oContent->{$strSection}{$strKey})) { if (ref($oContent->{$strSection}{$strKey}) ne 'ARRAY') { $oContent->{$strSection}{$strKey} = [$oContent->{$strSection}{$strKey}]; } push(@{$oContent->{$strSection}{$strKey}}, $strValue); } else { $oContent->{$strSection}{$strKey} = $strValue; } } # Else read the value as stricter JSON else { ${$oContent}{$strSection}{$strKey} = $oJSON->decode($strValue); } } } } # Error if the file is empty if (!($bRelaxed || defined($oContent))) { confess &log(ERROR, 'no key/value pairs found', ERROR_CONFIG); } return true; } or do { # Confess the error if it should not be ignored if (!$bIgnoreInvalid) { confess $EVAL_ERROR; } # Undef content when errors are ignored undef($oContent); }; # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'oContent', value => $oContent, trace => true} ); } #################################################################################################################################### # save() - save the file. 
#################################################################################################################################### sub save { my $self = shift; # Save only if modified if ($self->{bModified}) { # Calculate the hash $self->hash(); # Save the file $self->{oStorage}->put($self->{strFileName}, iniRender($self->{oContent}), {strCipherPass => $self->{strCipherPass}}); if ($self->{oStorage}->can('pathSync')) { $self->{oStorage}->pathSync(dirname($self->{strFileName})); } $self->{oStorage}->put($self->{strFileName} . INI_COPY_EXT, iniRender($self->{oContent}), {strCipherPass => $self->{strCipherPass}}); if ($self->{oStorage}->can('pathSync')) { $self->{oStorage}->pathSync(dirname($self->{strFileName})); } $self->{bModified} = false; # Indicate the file now exists $self->{bExists} = true; # File was saved return true; } # File was not saved return false; } #################################################################################################################################### # saveCopy - save only a copy of the file. #################################################################################################################################### sub saveCopy { my $self = shift; if ($self->{oStorage}->exists($self->{strFileName})) { confess &log(ASSERT, "cannot save copy only when '$self->{strFileName}' exists"); } $self->hash(); $self->{oStorage}->put($self->{strFileName} . INI_COPY_EXT, iniRender($self->{oContent}), {strCipherPass => $self->{strCipherPass}}); } #################################################################################################################################### # iniRender() - render hash to standard INI format. 
#################################################################################################################################### push @EXPORT, qw(iniRender); sub iniRender { # Assign function parameters, defaults, and log debug info my ( $strOperation, $oContent, $bRelaxed, ) = logDebugParam ( __PACKAGE__ . '::iniRender', \@_, {name => 'oContent', trace => true}, {name => 'bRelaxed', default => false, trace => true}, ); # Open the ini file for writing my $strContent = ''; my $bFirst = true; # Create the JSON object canonical so that fields are alpha ordered to pass unit tests my $oJSON = JSON::PP->new()->canonical()->allow_nonref(); # Write the INI file foreach my $strSection (sort(keys(%$oContent))) { # Add a linefeed between sections if (!$bFirst) { $strContent .= "\n"; } # Write the section $strContent .= "[${strSection}]\n"; # Iterate through all keys in the section foreach my $strKey (sort(keys(%{$oContent->{$strSection}}))) { # If the value is a hash then convert it to JSON, otherwise store as is my $strValue = ${$oContent}{$strSection}{$strKey}; # If relaxed then store as old-style config if ($bRelaxed) { # If the value is an array then save each element to a separate key/value pair if (ref($strValue) eq 'ARRAY') { foreach my $strArrayValue (@{$strValue}) { $strContent .= "${strKey}=${strArrayValue}\n"; } } # Else write a standard key/value pair else { $strContent .= "${strKey}=${strValue}\n"; } } # Else write as stricter JSON else { # Skip the checksum for now but write all other key/value pairs if (!($strSection eq INI_SECTION_BACKREST && $strKey eq INI_KEY_CHECKSUM)) { $strContent .= "${strKey}=" . $oJSON->encode($strValue) . "\n"; } } } $bFirst = false; } # If there is a checksum write it at the end of the file. Having the checksum at the end of the file allows some major # performance optimizations which we won't implement in Perl, but will make the C code much more efficient. 
if (!$bRelaxed && defined($oContent->{&INI_SECTION_BACKREST}) && defined($oContent->{&INI_SECTION_BACKREST}{&INI_KEY_CHECKSUM})) { $strContent .= "\n[" . INI_SECTION_BACKREST . "]\n" . INI_KEY_CHECKSUM . '=' . $oJSON->encode($oContent->{&INI_SECTION_BACKREST}{&INI_KEY_CHECKSUM}) . "\n"; } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strContent', value => $strContent, trace => true} ); } #################################################################################################################################### # hash() - generate hash for the manifest. #################################################################################################################################### sub hash { my $self = shift; # Remove the old checksum delete($self->{oContent}{&INI_SECTION_BACKREST}{&INI_KEY_CHECKSUM}); # Set the new checksum $self->{oContent}{&INI_SECTION_BACKREST}{&INI_KEY_CHECKSUM} = sha1_hex(JSON::PP->new()->canonical()->allow_nonref()->encode($self->{oContent})); return $self->{oContent}{&INI_SECTION_BACKREST}{&INI_KEY_CHECKSUM}; } #################################################################################################################################### # get() - get a value. 
#################################################################################################################################### sub get { my $self = shift; my $strSection = shift; my $strKey = shift; my $strSubKey = shift; my $bRequired = shift; my $oDefault = shift; # Parameter constraints if (!defined($strSection)) { confess &log(ASSERT, 'strSection is required'); } if (defined($strSubKey) && !defined($strKey)) { confess &log(ASSERT, "strKey is required when strSubKey '${strSubKey}' is requested"); } # Get the result my $oResult = $self->{oContent}->{$strSection}; if (defined($strKey) && defined($oResult)) { $oResult = $oResult->{$strKey}; if (defined($strSubKey) && defined($oResult)) { $oResult = $oResult->{$strSubKey}; } } # When result is not defined if (!defined($oResult)) { # Error if a result is required if (!defined($bRequired) || $bRequired) { confess &log(ASSERT, "strSection '$strSection'" . (defined($strKey) ? ", strKey '$strKey'" : '') . (defined($strSubKey) ? ", strSubKey '$strSubKey'" : '') . ' is required but not defined'); } # Return default if specified if (defined($oDefault)) { return $oDefault; } } return $oResult } #################################################################################################################################### # boolGet() - get a boolean value. #################################################################################################################################### sub boolGet { my $self = shift; my $strSection = shift; my $strKey = shift; my $strSubKey = shift; my $bRequired = shift; my $bDefault = shift; return $self->get( $strSection, $strKey, $strSubKey, $bRequired, defined($bDefault) ? ($bDefault ? INI_TRUE : INI_FALSE) : undef) ? true : false; } #################################################################################################################################### # numericGet() - get a numeric value. 
#################################################################################################################################### sub numericGet { my $self = shift; my $strSection = shift; my $strKey = shift; my $strSubKey = shift; my $bRequired = shift; my $nDefault = shift; return $self->get($strSection, $strKey, $strSubKey, $bRequired, defined($nDefault) ? $nDefault + 0 : undef) + 0; } #################################################################################################################################### # set - set a value. #################################################################################################################################### sub set { my $self = shift; my $strSection = shift; my $strKey = shift; my $strSubKey = shift; my $oValue = shift; # Parameter constraints if (!(defined($strSection) && defined($strKey))) { confess &log(ASSERT, 'strSection and strKey are required'); } my $oCurrentValue; if (defined($strSubKey)) { $oCurrentValue = \$self->{oContent}{$strSection}{$strKey}{$strSubKey}; } else { $oCurrentValue = \$self->{oContent}{$strSection}{$strKey}; } if (!defined($$oCurrentValue) || defined($oCurrentValue) != defined($oValue) || ${dclone($oCurrentValue)} ne ${dclone(\$oValue)}) { $$oCurrentValue = $oValue; if (!$self->{bModified}) { $self->{bModified} = true; } return true; } return false; } #################################################################################################################################### # boolSet - set a boolean value. #################################################################################################################################### sub boolSet { my $self = shift; my $strSection = shift; my $strKey = shift; my $strSubKey = shift; my $bValue = shift; $self->set($strSection, $strKey, $strSubKey, $bValue ? 
INI_TRUE : INI_FALSE); } #################################################################################################################################### # numericSet - set a numeric value. #################################################################################################################################### sub numericSet { my $self = shift; my $strSection = shift; my $strKey = shift; my $strSubKey = shift; my $nValue = shift; $self->set($strSection, $strKey, $strSubKey, defined($nValue) ? $nValue + 0 : undef); } #################################################################################################################################### # remove - remove a value. #################################################################################################################################### sub remove { my $self = shift; my $strSection = shift; my $strKey = shift; my $strSubKey = shift; # Test if the value exists if ($self->test($strSection, $strKey, $strSubKey)) { # Remove a subkey if (defined($strSubKey)) { delete($self->{oContent}{$strSection}{$strKey}{$strSubKey}); } # Remove a key if (defined($strKey)) { if (!defined($strSubKey)) { delete($self->{oContent}{$strSection}{$strKey}); } # Remove the section if it is now empty if (keys(%{$self->{oContent}{$strSection}}) == 0) { delete($self->{oContent}{$strSection}); } } # Remove a section if (!defined($strKey)) { delete($self->{oContent}{$strSection}); } # Record changes if (!$self->{bModified}) { $self->{bModified} = true; } return true; } return false; } #################################################################################################################################### # keys - get the list of keys in a section. 
#################################################################################################################################### sub keys { my $self = shift; my $strSection = shift; my $strSortOrder = shift; if ($self->test($strSection)) { if (!defined($strSortOrder) || $strSortOrder eq INI_SORT_FORWARD) { return (sort(keys(%{$self->get($strSection)}))); } elsif ($strSortOrder eq INI_SORT_REVERSE) { return (sort {$b cmp $a} (keys(%{$self->get($strSection)}))); } elsif ($strSortOrder eq INI_SORT_NONE) { return (keys(%{$self->get($strSection)})); } else { confess &log(ASSERT, "invalid strSortOrder '${strSortOrder}'"); } } my @stryEmptyArray; return @stryEmptyArray; } #################################################################################################################################### # test - test a value. # # Test a value to see if it equals the supplied test value. If no test value is given, tests that the section, key, or subkey # is defined. #################################################################################################################################### sub test { my $self = shift; my $strSection = shift; my $strValue = shift; my $strSubValue = shift; my $strTest = shift; # Get the value my $strResult = $self->get($strSection, $strValue, $strSubValue, false); # Is there a result if (defined($strResult)) { # Is there a value to test against? if (defined($strTest)) { # Make sure these are explicit strings or Devel::Cover thinks they are equal if one side is a boolean return ($strResult . '') eq ($strTest . '') ? true : false; } return true; } return false; } #################################################################################################################################### # boolTest - test a boolean value, see test(). 
#################################################################################################################################### sub boolTest { my $self = shift; my $strSection = shift; my $strValue = shift; my $strSubValue = shift; my $bTest = shift; return $self->test($strSection, $strValue, $strSubValue, defined($bTest) ? ($bTest ? INI_TRUE : INI_FALSE) : undef); } #################################################################################################################################### # cipherPassSub - gets the passphrase (if it exists) used to read/write subsequent files #################################################################################################################################### sub cipherPassSub { my $self = shift; return $self->get(INI_SECTION_CIPHER, INI_KEY_CIPHER_PASS, undef, false); } #################################################################################################################################### # Properties. #################################################################################################################################### sub modified {shift->{bModified}} # Has the data been modified since last load/save? sub exists {shift->{bExists}} # Is the data persisted to file? 
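The checksum scheme implemented by `hash()` above (and verified in `headerCheck()`) removes any stored checksum, then takes the SHA-1 of the canonical sorted-key JSON encoding of the content. A Python sketch of the idea; note that JSON::PP's canonical byte encoding may differ slightly from `json.dumps`, so the digests below are illustrative rather than byte-compatible with real pgBackRest files:

```python
import hashlib
import json

def ini_hash(content):
    """Recompute the ini checksum the way hash() does: drop any existing
    checksum, SHA-1 the canonical (sorted-key) JSON of the content, then
    store and return the digest under [backrest] backrest-checksum."""
    content.setdefault('backrest', {}).pop('backrest-checksum', None)
    canonical = json.dumps(content, sort_keys=True, separators=(',', ':'))
    digest = hashlib.sha1(canonical.encode('utf-8')).hexdigest()
    content['backrest']['backrest-checksum'] = digest
    return digest
```

Because the old checksum is removed before hashing, recomputing over a file that already carries a checksum is stable.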
sub cipherPass {shift->{strCipherPass}} # Return passphrase (will be undef if repo not encrypted) 1; pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Common/Log.pm000066400000000000000000000670351416457663300235370ustar00rootroot00000000000000#################################################################################################################################### # COMMON LOG MODULE #################################################################################################################################### package pgBackRestDoc::Common::Log; use strict; use warnings FATAL => qw(all); use Carp qw(confess longmess); use English '-no_match_vars'; use Exporter qw(import); our @EXPORT = qw(); use Fcntl qw(:DEFAULT :flock); use File::Basename qw(dirname); use Scalar::Util qw(blessed reftype); use Time::HiRes qw(gettimeofday usleep); use pgBackRestDoc::Common::Exception; use pgBackRestDoc::Common::String; #################################################################################################################################### # Boolean constants #################################################################################################################################### use constant true => 1; push @EXPORT, qw(true); use constant false => 0; push @EXPORT, qw(false); #################################################################################################################################### # Log level constants #################################################################################################################################### use constant TRACE => 'TRACE'; push @EXPORT, qw(TRACE); use constant DEBUG => 'DEBUG'; push @EXPORT, qw(DEBUG); use constant DETAIL => 'DETAIL'; push @EXPORT, qw(DETAIL); use constant INFO => 'INFO'; push @EXPORT, qw(INFO); use constant WARN => 'WARN'; push @EXPORT, qw(WARN); use constant PROTOCOL => 'PROTOCOL'; push @EXPORT, qw(PROTOCOL); use constant ERROR => 'ERROR'; push @EXPORT, qw(ERROR); use 
constant ASSERT => 'ASSERT'; push @EXPORT, qw(ASSERT); use constant OFF => 'OFF'; push @EXPORT, qw(OFF); #################################################################################################################################### # Log levels ranked by severity #################################################################################################################################### my %oLogLevelRank; $oLogLevelRank{TRACE}{rank} = 8; $oLogLevelRank{DEBUG}{rank} = 7; $oLogLevelRank{DETAIL}{rank} = 6; $oLogLevelRank{INFO}{rank} = 5; $oLogLevelRank{WARN}{rank} = 4; $oLogLevelRank{PROTOCOL}{rank} = 3; $oLogLevelRank{ERROR}{rank} = 2; $oLogLevelRank{ASSERT}{rank} = 1; $oLogLevelRank{OFF}{rank} = 0; #################################################################################################################################### # Module globals #################################################################################################################################### my $hLogFile = undef; my $strLogFileCache = undef; my $strLogLevelFile = OFF; my $strLogLevelConsole = OFF; my $strLogLevelStdErr = WARN; my $bLogTimestamp = true; # Size of the process id log field my $iLogProcessSize = 2; # Flags to limit banner printing until there is actual output my $bLogFileExists; my $bLogFileFirst; # Allow log to be globally enabled or disabled with logEnable() and logDisable() my $bLogDisable = 0; # Allow errors to be logged as warnings my $bLogWarnOnError = 0; # Store the last logged error my $oErrorLast; #################################################################################################################################### # logFileSet - set the file messages will be logged to #################################################################################################################################### sub logFileSet { my $oStorage = shift; my $strFile = shift; my $bLogFileFirstParam = shift; # Only open the log file if file logging is 
enabled if ($strLogLevelFile ne OFF) { $oStorage->pathCreate(dirname($strFile), {strMode => '0750', bIgnoreExists => true, bCreateParent => true}); $strFile .= '.log'; $bLogFileExists = -e $strFile ? true : false; $bLogFileFirst = defined($bLogFileFirstParam) ? $bLogFileFirstParam : false; if (!sysopen($hLogFile, $strFile, O_WRONLY | O_CREAT | O_APPEND, oct('0640'))) { logErrorResult(ERROR_FILE_OPEN, "unable to open log file '${strFile}'", $OS_ERROR); } # Write out anything that was cached before the file was opened if (defined($strLogFileCache)) { logBanner(); syswrite($hLogFile, $strLogFileCache); undef($strLogFileCache); } } } push @EXPORT, qw(logFileSet); #################################################################################################################################### # logBanner # # Output a banner on the first log entry written to a file #################################################################################################################################### sub logBanner { if ($bLogFileFirst) { if ($bLogFileExists) { syswrite($hLogFile, "\n"); } syswrite($hLogFile, "-------------------PROCESS START-------------------\n"); } $bLogFileFirst = false; } #################################################################################################################################### # logLevelSet - set the log level for file and console #################################################################################################################################### sub logLevelSet { my $strLevelFileParam = shift; my $strLevelConsoleParam = shift; my $strLevelStdErrParam = shift; my $bLogTimestampParam = shift; my $iLogProcessMax = shift; if (defined($strLevelFileParam)) { if (!defined($oLogLevelRank{uc($strLevelFileParam)}{rank})) { confess &log(ERROR, "file log level ${strLevelFileParam} does not exist"); } $strLogLevelFile = uc($strLevelFileParam); } if (defined($strLevelConsoleParam)) { if 
(!defined($oLogLevelRank{uc($strLevelConsoleParam)}{rank})) { confess &log(ERROR, "console log level ${strLevelConsoleParam} does not exist"); } $strLogLevelConsole = uc($strLevelConsoleParam); } if (defined($strLevelStdErrParam)) { if (!defined($oLogLevelRank{uc($strLevelStdErrParam)}{rank})) { confess &log(ERROR, "stderr log level ${strLevelStdErrParam} does not exist"); } $strLogLevelStdErr = uc($strLevelStdErrParam); } if (defined($bLogTimestampParam)) { $bLogTimestamp = $bLogTimestampParam; } if (defined($iLogProcessMax)) { $iLogProcessSize = $iLogProcessMax > 99 ? 3 : 2; } } push @EXPORT, qw(logLevelSet); #################################################################################################################################### # logDisable #################################################################################################################################### sub logDisable { $bLogDisable++; } push @EXPORT, qw(logDisable); #################################################################################################################################### # logEnable #################################################################################################################################### sub logEnable { $bLogDisable--; } push @EXPORT, qw(logEnable); #################################################################################################################################### # logWarnOnErrorDisable #################################################################################################################################### sub logWarnOnErrorDisable { $bLogWarnOnError--; } push @EXPORT, qw(logWarnOnErrorDisable); #################################################################################################################################### # logWarnOnErrorEnable - when an error is thrown, log it as a warning instead
#################################################################################################################################### sub logWarnOnErrorEnable { $bLogWarnOnError++; } push @EXPORT, qw(logWarnOnErrorEnable); #################################################################################################################################### # logDebugParam # # Log parameters passed to functions. #################################################################################################################################### use constant DEBUG_PARAM => '()'; sub logDebugParam { my $strFunction = shift; my $oyParamRef = shift; return logDebugProcess($strFunction, DEBUG_PARAM, undef, $oyParamRef, @_); } push @EXPORT, qw(logDebugParam); #################################################################################################################################### # logDebugReturn # # Log values returned from functions. #################################################################################################################################### use constant DEBUG_RETURN => '=>'; sub logDebugReturn { my $strFunction = shift; return logDebugProcess($strFunction, DEBUG_RETURN, undef, undef, @_); } push @EXPORT, qw(logDebugReturn); #################################################################################################################################### # logDebugMisc # # Log misc values and details during execution. 
#################################################################################################################################### use constant DEBUG_MISC => ''; sub logDebugMisc { my $strFunction = shift; my $strDetail = shift; return logDebugProcess($strFunction, DEBUG_MISC, $strDetail, undef, @_); } push @EXPORT, qw(logDebugMisc); #################################################################################################################################### # logDebugProcess #################################################################################################################################### sub logDebugProcess { my $strFunction = shift; my $strType = shift; my $strDetail = shift; my $oyParamRef = shift; my $iIndex = 0; my $oParamHash = {}; my @oyResult; my $bLogTrace = true; if ($strType eq DEBUG_PARAM) { push @oyResult, $strFunction; } # Process each parameter hash my $oParam = shift; my $bOptionalBlock = false; # Strip the package name off strFunction if it's pgBackRest $strFunction =~ s/^pgBackRest[^\:]*\:\://; while (defined($oParam)) { my $strParamName = $$oParam{name}; my $bParamOptional = defined($oParam->{optional}) && $oParam->{optional}; my $bParamRequired = !defined($oParam->{required}) || $oParam->{required}; my $oValue; # Should the param be redacted? $oParamHash->{$strParamName}{redact} = $oParam->{redact} ? 
true : false; # If param is optional then the optional block has been entered if ($bParamOptional) { if (defined($oParam->{required})) { confess &log(ASSERT, "cannot define 'required' for optional parameter '${strParamName}'"); } $bParamRequired = false; $bOptionalBlock = true; } # Don't allow non-optional parameters once optional block has started if ($bParamOptional != $bOptionalBlock) { confess &log(ASSERT, "non-optional parameter '${strParamName}' invalid after optional parameters"); } # Push the return value into the return value array if ($strType eq DEBUG_PARAM) { if ($bParamOptional) { $oValue = $$oyParamRef[$iIndex]->{$strParamName}; } else { $oValue = $$oyParamRef[$iIndex]; } if (defined($oValue)) { push(@oyResult, $oValue); } else { push(@oyResult, $${oParam}{default}); $$oParamHash{$strParamName}{default} = true; } $oValue = $oyResult[-1]; if (!defined($oValue) && $bParamRequired) { confess &log(ASSERT, "${strParamName} is required in ${strFunction}"); } } else { if (ref($$oParam{value}) eq 'ARRAY') { if (defined($$oParam{ref}) && $$oParam{ref}) { push(@oyResult, $$oParam{value}); } else { push(@oyResult, @{$$oParam{value}}); } } else { push(@oyResult, $$oParam{value}); } $oValue = $$oParam{value}; } if (!defined($$oParam{log}) || $$oParam{log}) { # If the parameter is a hash but not blessed then represent it as a string # ??? 
This should go away once the inputs to logDebug can be changed if (ref($oValue) eq 'HASH' && !blessed($oValue)) { $$oParamHash{$strParamName}{value} = '[hash]'; } # Else log the parameter value exactly else { $$oParamHash{$strParamName}{value} = $oValue; } # There are certain return values that it's wasteful to generate debug logging for if (!($strParamName eq 'self') && (!defined($$oParam{trace}) || !$$oParam{trace})) { $bLogTrace = false; } } # Get the next parameter hash $oParam = shift; if (!$bParamOptional) { $iIndex++; } } if (defined($strDetail) && $iIndex == 0) { $bLogTrace = false; } logDebugOut($strFunction, $strType, $strDetail, $oParamHash, $bLogTrace ? TRACE : DEBUG); # If there are one or zero return values then just return a scalar (this will be undef if there are no return values) if (@oyResult == 1) { return $oyResult[0]; } # Else return an array containing return values return @oyResult; } #################################################################################################################################### # logDebugBuild #################################################################################################################################### sub logDebugBuild { my $strValue = shift; my $rResult; # Value is undefined if (!defined($strValue)) { $rResult = \'[undef]'; } # Value is not a ref, but return it as a ref for efficiency elsif (!ref($strValue)) { $rResult = \$strValue; } # Value is a hash elsif (ref($strValue) eq 'HASH') { my $strValueHash; for my $strSubValue (sort(keys(%{$strValue}))) { $strValueHash .= (defined($strValueHash) ? ', ' : '{') . "${strSubValue} => " . ${logDebugBuild($strValue->{$strSubValue})}; } $rResult = \(defined($strValueHash) ? $strValueHash . '}' : '{}'); } # Value is an array elsif (ref($strValue) eq 'ARRAY') { my $strValueArray; for my $strSubValue (@{$strValue}) { $strValueArray .= (defined($strValueArray) ? ', ' : '(') . 
${logDebugBuild($strSubValue)}; } $rResult = \(defined($strValueArray) ? $strValueArray . ')' : '()'); } # Else some other type ??? For the moment this is forced to object to not make big log changes else { $rResult = \('[object]'); } return $rResult; } push @EXPORT, qw(logDebugBuild); #################################################################################################################################### # logDebugOut #################################################################################################################################### use constant DEBUG_STRING_MAX_LEN => 1024; sub logDebugOut { my $strFunction = shift; my $strType = shift; my $strMessage = shift; my $oParamHash = shift; my $strLevel = shift; $strLevel = defined($strLevel) ? $strLevel : DEBUG; if ($oLogLevelRank{$strLevel}{rank} <= $oLogLevelRank{$strLogLevelConsole}{rank} || $oLogLevelRank{$strLevel}{rank} <= $oLogLevelRank{$strLogLevelFile}{rank} || $oLogLevelRank{$strLevel}{rank} <= $oLogLevelRank{$strLogLevelStdErr}{rank}) { if (defined($oParamHash)) { my $strParamSet; foreach my $strParam (sort(keys(%$oParamHash))) { if (defined($strParamSet)) { $strParamSet .= ', '; } my $strValueRef = defined($oParamHash->{$strParam}{value}) ? logDebugBuild($oParamHash->{$strParam}{value}) : undef; my $bDefault = defined($$strValueRef) && defined($$oParamHash{$strParam}{default}) ? $$oParamHash{$strParam}{default} : false; $strParamSet .= "${strParam} = " . ($oParamHash->{$strParam}{redact} && defined($$strValueRef) ? '' : ($bDefault ? '<' : '') . (defined($$strValueRef) ? ($strParam =~ /^(b|is)/ ? ($$strValueRef ? 'true' : 'false'): (length($$strValueRef) > DEBUG_STRING_MAX_LEN ? substr($$strValueRef, 0, DEBUG_STRING_MAX_LEN) . ' ... ': $$strValueRef)) : '[undef]') . ($bDefault ? '>' : '')); } if (defined($strMessage)) { $strMessage = $strMessage . (defined($strParamSet) ? ": ${strParamSet}" : ''); } else { $strMessage = $strParamSet; } } &log($strLevel, "${strFunction}${strType}" . 
(defined($strMessage) ? ": $strMessage" : '')); } } #################################################################################################################################### # logException #################################################################################################################################### sub logException { my $oException = shift; return &log($oException->level(), $oException->message(), $oException->code(), undef, undef, undef, $oException->extra()); } push @EXPORT, qw(logException); #################################################################################################################################### # logErrorResult #################################################################################################################################### sub logErrorResult { my $iCode = shift; my $strMessage = shift; my $strResult = shift; confess &log(ERROR, $strMessage . (defined($strResult) ? ': ' . trim($strResult) : ''), $iCode); } push @EXPORT, qw(logErrorResult); #################################################################################################################################### # LOG - log messages #################################################################################################################################### sub log { my $strLevel = shift; my $strMessage = shift; my $iCode = shift; my $bSuppressLog = shift; my $iIndent = shift; my $iProcessId = shift; my $rExtra = shift; # Set defaults $bSuppressLog = defined($bSuppressLog) ? 
$bSuppressLog : false; # Initialize rExtra if (!defined($rExtra)) { $rExtra = { bLogFile => false, bLogConsole => false, }; } # Set operational variables my $strMessageFormat = $strMessage; my $iLogLevelRank = $oLogLevelRank{$strLevel}{rank}; # Level rank must be valid if (!defined($iLogLevelRank)) { confess &log(ASSERT, "log level ${strLevel} does not exist"); } # If message was undefined then set default message if (!defined($strMessageFormat)) { $strMessageFormat = '(undefined)'; } # Set the error code if ($strLevel eq ASSERT) { $iCode = ERROR_ASSERT; } elsif ($strLevel eq ERROR && !defined($iCode)) { $iCode = ERROR_UNKNOWN; } $strMessageFormat = (defined($iCode) ? sprintf('[%03d]: ', $iCode) : '') . $strMessageFormat; # Indent subsequent lines of the message if it has more than one line - makes the log more readable if (defined($iIndent)) { my $strIndent = ' ' x $iIndent; $strMessageFormat =~ s/\n/\n${strIndent}/g; } else { # Indent subsequent message lines so they align $bLogTimestamp ? $strMessageFormat =~ s/\n/\n /g : $strMessageFormat =~ s/\n/\n /g } # Indent TRACE and debug levels so they are distinct from normal messages if ($strLevel eq TRACE) { $strMessageFormat =~ s/\n/\n /g; $strMessageFormat = ' ' . $strMessageFormat; } elsif ($strLevel eq DEBUG) { $strMessageFormat =~ s/\n/\n /g; $strMessageFormat = ' ' . $strMessageFormat; } # Format the message text my ($sec, $min, $hour, $mday, $mon, $year, $wday, $yday, $isdst) = localtime(time); # If logging warnings as errors then change the display level and rank. These will be used to determine if the message will be # displayed or not. my $strDisplayLevel = ($bLogWarnOnError && $strLevel eq ERROR ? WARN : $strLevel); my $iLogDisplayLevelRank = ($bLogWarnOnError && $strLevel eq ERROR ? $oLogLevelRank{$strDisplayLevel}{rank} : $iLogLevelRank); $strMessageFormat = ($bLogTimestamp ? timestampFormat() . sprintf('.%03d ', (gettimeofday() - int(gettimeofday())) * 1000) : '') . 
sprintf('P%0*d', $iLogProcessSize, defined($iProcessId) ? $iProcessId : 0) .
    (' ' x (7 - length($strDisplayLevel))) . "${strDisplayLevel}: ${strMessageFormat}\n";

    # Skip output if disabled
    if (!$bLogDisable)
    {
        # Output to stderr if the configured stderr log level rank is at or above the display level rank
        if (!$rExtra->{bLogConsole} && $iLogDisplayLevelRank <= $oLogLevelRank{$strLogLevelStdErr}{rank})
        {
            if ($strLogLevelStdErr ne PROTOCOL)
            {
                syswrite(*STDERR, $strDisplayLevel . (defined($iCode) ? sprintf(' [%03d]: ', $iCode) : ': '));
            }

            syswrite(*STDERR, "${strMessage}\n");
            $rExtra->{bLogConsole} = true;
        }
        # Else output to stdout if the configured console log level rank is at or above the display level rank
        elsif (!$rExtra->{bLogConsole} && $iLogDisplayLevelRank <= $oLogLevelRank{$strLogLevelConsole}{rank})
        {
            if (!$bSuppressLog)
            {
                syswrite(*STDOUT, $strMessageFormat);

                # This is here for debugging purposes - it's not clear how best to make it into a switch
                # if ($strLevel eq ASSERT || $strLevel eq ERROR)
                # {
                #     my $strStackTrace = longmess() . "\n";
                #     $strStackTrace =~ s/\n/\n /g;
                #     syswrite(*STDOUT, $strStackTrace);
                # }
            }

            $rExtra->{bLogConsole} = true;
        }

        # Output to file if the configured file log level rank is at or above the display level rank
        if (!$rExtra->{bLogFile} && $iLogDisplayLevelRank <= $oLogLevelRank{$strLogLevelFile}{rank})
        {
            if (defined($hLogFile) || (defined($strLogLevelFile) && $strLogLevelFile ne OFF))
            {
                if (!$bSuppressLog)
                {
                    if (defined($hLogFile))
                    {
                        logBanner();
                        syswrite($hLogFile, $strMessageFormat);
                    }
                    else
                    {
                        $strLogFileCache .= $strMessageFormat;
                    }

                    if ($strDisplayLevel eq ASSERT ||
                        ($strDisplayLevel eq ERROR && ($strLogLevelFile eq DEBUG || $strLogLevelFile eq TRACE)))
                    {
                        my $strStackTrace = longmess() .
"\n";
                        $strStackTrace =~ s/\n/\n /g;

                        if (defined($hLogFile))
                        {
                            syswrite($hLogFile, $strStackTrace);
                        }
                        else
                        {
                            $strLogFileCache .= $strStackTrace;
                        }
                    }
                }
            }

            $rExtra->{bLogFile} = true;
        }
    }

    # Return a typed exception if code is defined
    if (defined($iCode))
    {
        $oErrorLast = new pgBackRestDoc::Common::Exception($strLevel, $iCode, $strMessage, longmess(), $rExtra);
        return $oErrorLast;
    }

    # Return the message so it can be used in a confess
    return $strMessage;
}

push @EXPORT, qw(log);

####################################################################################################################################
# logErrorLast - get the last logged error
####################################################################################################################################
sub logErrorLast
{
    return $oErrorLast;
}

push @EXPORT, qw(logErrorLast);

####################################################################################################################################
# logLevel - get the current log levels
####################################################################################################################################
sub logLevel
{
    return ($strLogLevelFile, $strLogLevelConsole, $strLogLevelStdErr, $bLogTimestamp);
}

push @EXPORT, qw(logLevel);

####################################################################################################################################
# logFileCacheClear - Clear the log file cache
####################################################################################################################################
sub logFileCacheClear
{
    undef($strLogFileCache);
}

push @EXPORT, qw(logFileCacheClear);

####################################################################################################################################
# logFileCache - Get the log file cache
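#
# Usage sketch (hypothetical caller): messages logged before a log file handle exists accumulate in $strLogFileCache, so a caller
# is expected to write the cache out and then clear it once the file has been opened:
#
#     my $strCache = logFileCache();
#     syswrite($hLogFile, $strCache) if defined($strCache);
#     logFileCacheClear();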
####################################################################################################################################
sub logFileCache
{
    return $strLogFileCache;
}

push @EXPORT, qw(logFileCache);

1;

# pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Common/String.pm

####################################################################################################################################
# COMMON STRING MODULE
####################################################################################################################################
package pgBackRestDoc::Common::String;

use strict;
use warnings FATAL => qw(all);
use Carp qw(confess longmess);

use Exporter qw(import);
    our @EXPORT = qw();
use File::Basename qw(dirname);

####################################################################################################################################
# trim
#
# Trim whitespace.
####################################################################################################################################
sub trim
{
    my $strBuffer = shift;

    if (!defined($strBuffer))
    {
        return;
    }

    $strBuffer =~ s/^\s+|\s+$//g;

    return $strBuffer;
}

push @EXPORT, qw(trim);

####################################################################################################################################
# coalesce - return first defined parameter
####################################################################################################################################
sub coalesce
{
    foreach my $strParam (@_)
    {
        if (defined($strParam))
        {
            return $strParam;
        }
    }

    return;
}

push @EXPORT, qw(coalesce);

####################################################################################################################################
# timestampFormat
#
# Get standard timestamp format (or formatted as specified).
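#
# Usage sketch (example output depends on the current local time):
#
#     timestampFormat();                 # default format, e.g. '2022-01-03 09:30:05'
#     timestampFormat('%4d');            # year only, e.g. '2022'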
####################################################################################################################################
sub timestampFormat
{
    my $strFormat = shift;
    my $lTime = shift;

    if (!defined($strFormat))
    {
        $strFormat = '%4d-%02d-%02d %02d:%02d:%02d';
    }

    if (!defined($lTime))
    {
        $lTime = time();
    }

    my ($iSecond, $iMinute, $iHour, $iMonthDay, $iMonth, $iYear, $iWeekDay, $iYearDay, $bIsDst) = localtime($lTime);

    if ($strFormat eq "%4d")
    {
        return sprintf($strFormat, $iYear + 1900);
    }
    else
    {
        return sprintf($strFormat, $iYear + 1900, $iMonth + 1, $iMonthDay, $iHour, $iMinute, $iSecond);
    }
}

push @EXPORT, qw(timestampFormat);

####################################################################################################################################
# stringSplit
####################################################################################################################################
sub stringSplit
{
    my $strString = shift;
    my $strChar = shift;
    my $iLength = shift;

    if (length($strString) <= $iLength)
    {
        return $strString, undef;
    }

    my $iPos = index($strString, $strChar);

    if ($iPos == -1)
    {
        return $strString, undef;
    }

    my $iNewPos = $iPos;

    while ($iNewPos != -1 && $iNewPos + 1 < $iLength)
    {
        $iPos = $iNewPos;
        $iNewPos = index($strString, $strChar, $iPos + 1);
    }

    return substr($strString, 0, $iPos + 1), substr($strString, $iPos + 1);
}

push @EXPORT, qw(stringSplit);

1;

# pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Custom/DocConfigData.pm

####################################################################################################################################
# Configuration Definition Data
#
# The configuration is defined in src/build/config/config.yaml, which also contains the documentation.
#################################################################################################################################### package pgBackRestDoc::Custom::DocConfigData; use strict; use warnings FATAL => qw(all); use Carp qw(confess); use Cwd qw(abs_path); use Exporter qw(import); our @EXPORT = qw(); use File::Basename qw(dirname basename); use Getopt::Long qw(GetOptions); use Storable qw(dclone); use pgBackRestDoc::Common::Exception; use pgBackRestDoc::Common::Log; use pgBackRestDoc::ProjectInfo; use pgBackRestTest::Common::Wait; #################################################################################################################################### # Command constants #################################################################################################################################### use constant CFGCMD_BACKUP => 'backup'; push @EXPORT, qw(CFGCMD_BACKUP); use constant CFGCMD_HELP => 'help'; push @EXPORT, qw(CFGCMD_HELP); use constant CFGCMD_INFO => 'info'; push @EXPORT, qw(CFGCMD_INFO); use constant CFGCMD_VERSION => 'version'; #################################################################################################################################### # Command role constants - roles allowed for each command. Commands may have multiple processes that work together to implement # their functionality. These roles allow each process to know what it is supposed to do. #################################################################################################################################### # Called directly by the user. This is the main process of the command that may or may not spawn other command roles. use constant CFGCMD_ROLE_MAIN => 'main'; push @EXPORT, qw(CFGCMD_ROLE_MAIN); # Async worker that is spawned so the main process can return a result while work continues. An async worker may spawn local or # remote workers. 
use constant CFGCMD_ROLE_ASYNC => 'async';
    push @EXPORT, qw(CFGCMD_ROLE_ASYNC);

# Local worker for parallelizing jobs. A local worker may spawn a remote worker.
use constant CFGCMD_ROLE_LOCAL => 'local';
    push @EXPORT, qw(CFGCMD_ROLE_LOCAL);

# Remote worker for accessing resources on another host
use constant CFGCMD_ROLE_REMOTE => 'remote';
    push @EXPORT, qw(CFGCMD_ROLE_REMOTE);

####################################################################################################################################
# Option constants - options that are allowed for commands
####################################################################################################################################

# Command-line only options
#-----------------------------------------------------------------------------------------------------------------------------------
use constant CFGOPT_CONFIG => 'config';
    push @EXPORT, qw(CFGOPT_CONFIG);
use constant CFGOPT_STANZA => 'stanza';
    push @EXPORT, qw(CFGOPT_STANZA);

# Command-line only local/remote options
#-----------------------------------------------------------------------------------------------------------------------------------

# Paths
use constant CFGOPT_LOCK_PATH => 'lock-path';
    push @EXPORT, qw(CFGOPT_LOCK_PATH);
use constant CFGOPT_LOG_PATH => 'log-path';
    push @EXPORT, qw(CFGOPT_LOG_PATH);
use constant CFGOPT_SPOOL_PATH => 'spool-path';
    push @EXPORT, qw(CFGOPT_SPOOL_PATH);

# Logging
use constant CFGOPT_LOG_LEVEL_STDERR => 'log-level-stderr';
    push @EXPORT, qw(CFGOPT_LOG_LEVEL_STDERR);
use constant CFGOPT_LOG_TIMESTAMP => 'log-timestamp';
    push @EXPORT, qw(CFGOPT_LOG_TIMESTAMP);

# Repository options
#-----------------------------------------------------------------------------------------------------------------------------------

# Prefix that must be used by all repo options that allow multiple configurations
use constant CFGDEF_PREFIX_REPO => 'repo';

# Repository General
use constant CFGOPT_REPO_PATH => CFGDEF_PREFIX_REPO .
'-path'; push @EXPORT, qw(CFGOPT_REPO_PATH); # Repository Host use constant CFGOPT_REPO_HOST => CFGDEF_PREFIX_REPO . '-host'; use constant CFGOPT_REPO_HOST_CMD => CFGOPT_REPO_HOST . '-cmd'; push @EXPORT, qw(CFGOPT_REPO_HOST_CMD); # Stanza options #----------------------------------------------------------------------------------------------------------------------------------- # Determines how many databases can be configured use constant CFGDEF_INDEX_PG => 8; push @EXPORT, qw(CFGDEF_INDEX_PG); # Prefix that must be used by all db options that allow multiple configurations use constant CFGDEF_PREFIX_PG => 'pg'; push @EXPORT, qw(CFGDEF_PREFIX_PG); # Set default PostgreSQL cluster use constant CFGOPT_PG_HOST => CFGDEF_PREFIX_PG . '-host'; use constant CFGOPT_PG_HOST_CMD => CFGOPT_PG_HOST . '-cmd'; push @EXPORT, qw(CFGOPT_PG_HOST_CMD); #################################################################################################################################### # Option definition constants - defines, types, sections, etc. 
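#
# For reference, the prefixed option constants defined above compose by simple string concatenation, e.g. (values shown for
# illustration):
#
#     CFGOPT_REPO_PATH        # 'repo-path'
#     CFGOPT_REPO_HOST_CMD    # 'repo-host-cmd'
#     CFGOPT_PG_HOST_CMD      # 'pg-host-cmd'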
#################################################################################################################################### # Command defines #----------------------------------------------------------------------------------------------------------------------------------- use constant CFGDEF_LOG_FILE => 'log-file'; push @EXPORT, qw(CFGDEF_LOG_FILE); use constant CFGDEF_LOG_LEVEL_DEFAULT => 'log-level-default'; push @EXPORT, qw(CFGDEF_LOG_LEVEL_DEFAULT); use constant CFGDEF_LOCK_REQUIRED => 'lock-required'; push @EXPORT, qw(CFGDEF_LOCK_REQUIRED); use constant CFGDEF_LOCK_REMOTE_REQUIRED => 'lock-remote-required'; push @EXPORT, qw(CFGDEF_LOCK_REMOTE_REQUIRED); use constant CFGDEF_LOCK_TYPE => 'lock-type'; push @EXPORT, qw(CFGDEF_LOCK_TYPE); use constant CFGDEF_LOCK_TYPE_NONE => 'none'; use constant CFGDEF_PARAMETER_ALLOWED => 'parameter-allowed'; push @EXPORT, qw(CFGDEF_PARAMETER_ALLOWED); # Option defines #----------------------------------------------------------------------------------------------------------------------------------- use constant CFGDEF_ALLOW_LIST => 'allow-list'; push @EXPORT, qw(CFGDEF_ALLOW_LIST); use constant CFGDEF_ALLOW_RANGE => 'allow-range'; push @EXPORT, qw(CFGDEF_ALLOW_RANGE); use constant CFGDEF_DEFAULT => 'default'; push @EXPORT, qw(CFGDEF_DEFAULT); use constant CFGDEF_DEFAULT_LITERAL => 'default-literal'; push @EXPORT, qw(CFGDEF_DEFAULT_LITERAL); # Group options together to share common configuration use constant CFGDEF_GROUP => 'group'; push @EXPORT, qw(CFGDEF_GROUP); use constant CFGDEF_INDEX => 'index'; push @EXPORT, qw(CFGDEF_INDEX); use constant CFGDEF_INHERIT => 'inherit'; push @EXPORT, qw(CFGDEF_INHERIT); use constant CFGDEF_INTERNAL => 'internal'; push @EXPORT, qw(CFGDEF_INTERNAL); use constant CFGDEF_DEPRECATE => 'deprecate'; push @EXPORT, qw(CFGDEF_DEPRECATE); use constant CFGDEF_NEGATE => 'negate'; push @EXPORT, qw(CFGDEF_NEGATE); use constant CFGDEF_COMMAND => 'command'; push @EXPORT, qw(CFGDEF_COMMAND); use 
constant CFGDEF_COMMAND_ROLE => 'command-role'; push @EXPORT, qw(CFGDEF_COMMAND_ROLE); use constant CFGDEF_REQUIRED => 'required'; push @EXPORT, qw(CFGDEF_REQUIRED); use constant CFGDEF_RESET => 'reset'; push @EXPORT, qw(CFGDEF_RESET); use constant CFGDEF_SECTION => 'section'; push @EXPORT, qw(CFGDEF_SECTION); use constant CFGDEF_SECURE => 'secure'; push @EXPORT, qw(CFGDEF_SECURE); use constant CFGDEF_TYPE => 'type'; push @EXPORT, qw(CFGDEF_TYPE); # Option types #----------------------------------------------------------------------------------------------------------------------------------- use constant CFGDEF_TYPE_BOOLEAN => 'boolean'; push @EXPORT, qw(CFGDEF_TYPE_BOOLEAN); use constant CFGDEF_TYPE_HASH => 'hash'; push @EXPORT, qw(CFGDEF_TYPE_HASH); use constant CFGDEF_TYPE_INTEGER => 'integer'; push @EXPORT, qw(CFGDEF_TYPE_INTEGER); use constant CFGDEF_TYPE_LIST => 'list'; push @EXPORT, qw(CFGDEF_TYPE_LIST); use constant CFGDEF_TYPE_PATH => 'path'; push @EXPORT, qw(CFGDEF_TYPE_PATH); use constant CFGDEF_TYPE_STRING => 'string'; push @EXPORT, qw(CFGDEF_TYPE_STRING); use constant CFGDEF_TYPE_SIZE => 'size'; push @EXPORT, qw(CFGDEF_TYPE_SIZE); use constant CFGDEF_TYPE_TIME => 'time'; push @EXPORT, qw(CFGDEF_TYPE_TIME); # Option config sections #----------------------------------------------------------------------------------------------------------------------------------- use constant CFGDEF_SECTION_GLOBAL => 'global'; push @EXPORT, qw(CFGDEF_SECTION_GLOBAL); use constant CFGDEF_SECTION_STANZA => 'stanza'; push @EXPORT, qw(CFGDEF_SECTION_STANZA); #################################################################################################################################### # Load configuration #################################################################################################################################### use YAML::XS qw(LoadFile); # Required so booleans are not read-only local $YAML::XS::Boolean = "JSON::PP"; my $rhConfig = 
LoadFile(dirname(dirname($0)) . '/src/build/config/config.yaml'); my $rhCommandDefine = $rhConfig->{'command'}; my $rhOptionGroupDefine = $rhConfig->{'optionGroup'}; my $rhConfigDefine = $rhConfig->{'option'}; #################################################################################################################################### # Fix errors introduced by YAML::XS::LoadFile. This is typically fixed by setting local $YAML::XS::Boolean = "JSON::PP", but older # Debian/Ubuntu versions do not support this fix. Some booleans get set read only and others also end up as empty strings. There is # no apparent pattern to what gets broken so it is important to be on the lookout for strange output when adding new options. # # ??? For now this code is commented out since packages for older Debians can be built using backports. It is being preserved just # in case it is needed before the migration to C is complete. #################################################################################################################################### # sub optionDefineFixup # { # my $strKey = shift; # my $rhDefine = shift; # # # Fix read-only required values so they are writable # if (defined($rhDefine->{&CFGDEF_REQUIRED})) # { # my $value = $rhDefine->{&CFGDEF_REQUIRED} ? true : false; # delete($rhDefine->{&CFGDEF_REQUIRED}); # $rhDefine->{&CFGDEF_REQUIRED} = $value; # } # # # If the default is an empty string set to false. This must be a mangled boolean since empty strings are not valid defaults. 
# if (defined($rhDefine->{&CFGDEF_DEFAULT}) && $rhDefine->{&CFGDEF_DEFAULT} eq '') # { # delete($rhDefine->{&CFGDEF_DEFAULT}); # $rhDefine->{&CFGDEF_DEFAULT} = false; # } # } # # # Fix all options # foreach my $strKey (sort(keys(%{$rhConfigDefine}))) # { # my $rhOption = $rhConfigDefine->{$strKey}; # optionDefineFixup($strKey, $rhOption); # # # Fix all option commands # if (ref($rhOption->{&CFGDEF_COMMAND})) # { # foreach my $strCommand (sort(keys(%{$rhOption->{&CFGDEF_COMMAND}}))) # { # optionDefineFixup("$strKey-$strCommand", $rhOption->{&CFGDEF_COMMAND}{$strCommand}); # } # } # } #################################################################################################################################### # Process command define defaults #################################################################################################################################### foreach my $strCommand (sort(keys(%{$rhCommandDefine}))) { # Commands are external by default if (!defined($rhCommandDefine->{$strCommand}{&CFGDEF_INTERNAL})) { $rhCommandDefine->{$strCommand}{&CFGDEF_INTERNAL} = false; } # Log files are created by default if (!defined($rhCommandDefine->{$strCommand}{&CFGDEF_LOG_FILE})) { $rhCommandDefine->{$strCommand}{&CFGDEF_LOG_FILE} = true; } # Default log level is INFO if (!defined($rhCommandDefine->{$strCommand}{&CFGDEF_LOG_LEVEL_DEFAULT})) { $rhCommandDefine->{$strCommand}{&CFGDEF_LOG_LEVEL_DEFAULT} = INFO; } # Default lock required is false if (!defined($rhCommandDefine->{$strCommand}{&CFGDEF_LOCK_REQUIRED})) { $rhCommandDefine->{$strCommand}{&CFGDEF_LOCK_REQUIRED} = false; } # Default lock remote required is false if (!defined($rhCommandDefine->{$strCommand}{&CFGDEF_LOCK_REMOTE_REQUIRED})) { $rhCommandDefine->{$strCommand}{&CFGDEF_LOCK_REMOTE_REQUIRED} = false; } # Lock type must be set if a lock is required if (!defined($rhCommandDefine->{$strCommand}{&CFGDEF_LOCK_TYPE})) { # Is a lock type required? 
if ($rhCommandDefine->{$strCommand}{&CFGDEF_LOCK_REQUIRED}) { confess &log(ERROR, "lock type is required for command '${strCommand}'"); } $rhCommandDefine->{$strCommand}{&CFGDEF_LOCK_TYPE} = CFGDEF_LOCK_TYPE_NONE; } else { if ($rhCommandDefine->{$strCommand}{&CFGDEF_LOCK_REQUIRED} && $rhCommandDefine->{$strCommand}{&CFGDEF_LOCK_TYPE} eq CFGDEF_LOCK_TYPE_NONE) { confess &log(ERROR, "lock type is required for command '${strCommand}' and cannot be 'none'"); } } # Default parameter allowed is false if (!defined($rhCommandDefine->{$strCommand}{&CFGDEF_PARAMETER_ALLOWED})) { $rhCommandDefine->{$strCommand}{&CFGDEF_PARAMETER_ALLOWED} = false; } # All commands have the default role if (!defined($rhCommandDefine->{$strCommand}{&CFGDEF_COMMAND_ROLE}{&CFGCMD_ROLE_MAIN})) { $rhCommandDefine->{$strCommand}{&CFGDEF_COMMAND_ROLE}{&CFGCMD_ROLE_MAIN} = {}; } } #################################################################################################################################### # Process option define defaults #################################################################################################################################### foreach my $strKey (sort(keys(%{$rhConfigDefine}))) { my $rhOption = $rhConfigDefine->{$strKey}; # If the define is a scalar then copy the entire define from the referenced option if (defined($rhConfigDefine->{$strKey}{&CFGDEF_INHERIT})) { # Make a copy in case there are overrides that need to be applied after inheriting my $hConfigDefineOverride = dclone($rhConfigDefine->{$strKey}); # Copy the option being inherited from $rhConfigDefine->{$strKey} = dclone($rhConfigDefine->{$rhConfigDefine->{$strKey}{&CFGDEF_INHERIT}}); # No need to copy the inheritance key delete($rhConfigDefine->{$strKey}{&CFGDEF_INHERIT}); # It makes no sense to inherit deprecations - they must be specified for each option delete($rhConfigDefine->{$strKey}{&CFGDEF_DEPRECATE}); # Apply overrides foreach my $strOptionDef (sort(keys(%{$hConfigDefineOverride}))) { 
$rhConfigDefine->{$strKey}{$strOptionDef} = $hConfigDefineOverride->{$strOptionDef}; } # Update option variable with new hash reference $rhOption = $rhConfigDefine->{$strKey} } # If command is not specified then the option is valid for all commands except version and help if (!defined($rhOption->{&CFGDEF_COMMAND})) { foreach my $strCommand (sort(keys(%{$rhCommandDefine}))) { next if $strCommand eq CFGCMD_HELP || $strCommand eq CFGCMD_VERSION; $rhOption->{&CFGDEF_COMMAND}{$strCommand} = {}; } } # Else if the command section is a scalar then copy the section from the referenced option elsif (defined($rhConfigDefine->{$strKey}{&CFGDEF_COMMAND}) && !ref($rhConfigDefine->{$strKey}{&CFGDEF_COMMAND})) { $rhConfigDefine->{$strKey}{&CFGDEF_COMMAND} = dclone($rhConfigDefine->{$rhConfigDefine->{$strKey}{&CFGDEF_COMMAND}}{&CFGDEF_COMMAND}); } # If the allow list is a scalar then copy the list from the referenced option if (defined($rhConfigDefine->{$strKey}{&CFGDEF_ALLOW_LIST}) && !ref($rhConfigDefine->{$strKey}{&CFGDEF_ALLOW_LIST})) { $rhConfigDefine->{$strKey}{&CFGDEF_ALLOW_LIST} = dclone($rhConfigDefine->{$rhConfigDefine->{$strKey}{&CFGDEF_ALLOW_LIST}}{&CFGDEF_ALLOW_LIST}); } # Default type is string if (!defined($rhConfigDefine->{$strKey}{&CFGDEF_TYPE})) { &log(ASSERT, "type is required for option '${strKey}'"); } # Default required is true if (!defined($rhConfigDefine->{$strKey}{&CFGDEF_REQUIRED})) { $rhConfigDefine->{$strKey}{&CFGDEF_REQUIRED} = true; } # Default internal is false if (!defined($rhConfigDefine->{$strKey}{&CFGDEF_INTERNAL})) { $rhConfigDefine->{$strKey}{&CFGDEF_INTERNAL} = false; } # All boolean config options can be negated. Boolean command-line options must be marked for negation individually. 
if ($rhConfigDefine->{$strKey}{&CFGDEF_TYPE} eq CFGDEF_TYPE_BOOLEAN && defined($rhConfigDefine->{$strKey}{&CFGDEF_SECTION})) { $rhConfigDefine->{$strKey}{&CFGDEF_NEGATE} = true; } # Default for negation is false if (!defined($rhConfigDefine->{$strKey}{&CFGDEF_NEGATE})) { $rhConfigDefine->{$strKey}{&CFGDEF_NEGATE} = false; } # All config options can be reset if (defined($rhConfigDefine->{$strKey}{&CFGDEF_SECTION})) { $rhConfigDefine->{$strKey}{&CFGDEF_RESET} = true; } elsif (!defined($rhConfigDefine->{$strKey}{&CFGDEF_RESET})) { $rhConfigDefine->{$strKey}{&CFGDEF_RESET} = false; } # By default options are not secure if (!defined($rhConfigDefine->{$strKey}{&CFGDEF_SECURE})) { $rhConfigDefine->{$strKey}{&CFGDEF_SECURE} = false; } # All int, size and time options must have an allow range if (($rhConfigDefine->{$strKey}{&CFGDEF_TYPE} eq CFGDEF_TYPE_INTEGER || $rhConfigDefine->{$strKey}{&CFGDEF_TYPE} eq CFGDEF_TYPE_TIME || $rhConfigDefine->{$strKey}{&CFGDEF_TYPE} eq CFGDEF_TYPE_SIZE) && !(defined($rhConfigDefine->{$strKey}{&CFGDEF_ALLOW_RANGE}) || defined($rhConfigDefine->{$strKey}{&CFGDEF_ALLOW_LIST}))) { confess &log(ASSERT, "int/size/time option '${strKey}' must have allow range or list"); } # Ensure all commands are valid foreach my $strCommand (sort(keys(%{$rhConfigDefine->{$strKey}{&CFGDEF_COMMAND}}))) { if (!defined($rhCommandDefine->{$strCommand})) { confess &log(ASSERT, "invalid command '${strCommand}'"); } } } # Generate valid command roles for each option foreach my $strOption (sort(keys(%{$rhConfigDefine}))) { my $rhOption = $rhConfigDefine->{$strOption}; # Generate valid command roles for each command in the option foreach my $strCommand (sort(keys(%{$rhOption->{&CFGDEF_COMMAND}}))) { # If command roles are defined in the option command override then check that they are valid if (defined($rhOption->{&CFGDEF_COMMAND}{$strCommand}{&CFGDEF_COMMAND_ROLE})) { foreach my $strCommandRole 
(sort(keys(%{$rhOption->{&CFGDEF_COMMAND}{$strCommand}{&CFGDEF_COMMAND_ROLE}}))) { if (!defined($rhCommandDefine->{$strCommand}{&CFGDEF_COMMAND_ROLE}{$strCommandRole})) { confess &log( ASSERT, "option '${strOption}', command '${strCommand}' has invalid command role '${strCommandRole}'"); } } } # Else if the option has command roles defined then use the intersection of command roles with the command elsif (defined($rhOption->{&CFGDEF_COMMAND_ROLE})) { foreach my $strCommandRole (sort(keys(%{$rhOption->{&CFGDEF_COMMAND_ROLE}}))) { if (defined($rhCommandDefine->{$strCommand}{&CFGDEF_COMMAND_ROLE}{$strCommandRole})) { $rhOption->{&CFGDEF_COMMAND}{$strCommand}{&CFGDEF_COMMAND_ROLE}{$strCommandRole} = {}; } } } # Else copy the command roles from the command else { foreach my $strCommandRole (sort(keys(%{$rhCommandDefine->{$strCommand}{&CFGDEF_COMMAND_ROLE}}))) { $rhOption->{&CFGDEF_COMMAND}{$strCommand}{&CFGDEF_COMMAND_ROLE}{$strCommandRole} = {}; } } } # Remove option command roles so they don't accidentally get used in processing (since they were copied to option commands) delete($rhOption->{&CFGDEF_COMMAND_ROLE}); } #################################################################################################################################### # Get option definition #################################################################################################################################### sub cfgDefine { return dclone($rhConfigDefine); } push @EXPORT, qw(cfgDefine); #################################################################################################################################### # Get command definition #################################################################################################################################### sub cfgDefineCommand { return dclone($rhCommandDefine); } push @EXPORT, qw(cfgDefineCommand); 
#################################################################################################################################### # Get option group definition #################################################################################################################################### sub cfgDefineOptionGroup { return dclone($rhOptionGroupDefine); } push @EXPORT, qw(cfgDefineOptionGroup); #################################################################################################################################### # Get list of all commands #################################################################################################################################### sub cfgDefineCommandList { # Return sorted list return (sort(keys(%{$rhCommandDefine}))); } push @EXPORT, qw(cfgDefineCommandList); #################################################################################################################################### # Get list of all option types #################################################################################################################################### sub cfgDefineOptionTypeList { my $rhOptionTypeMap; # Get unique list of types foreach my $strOption (sort(keys(%{$rhConfigDefine}))) { my $strOptionType = $rhConfigDefine->{$strOption}{&CFGDEF_TYPE}; if (!defined($rhOptionTypeMap->{$strOptionType})) { $rhOptionTypeMap->{$strOptionType} = true; } }; # Return sorted list return (sort(keys(%{$rhOptionTypeMap}))); } push @EXPORT, qw(cfgDefineOptionTypeList); 1; pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Custom/DocCustomRelease.pm000066400000000000000000000665601416457663300262430ustar00rootroot00000000000000#################################################################################################################################### # DOC RELEASE MODULE #################################################################################################################################### package 
pgBackRestDoc::Custom::DocCustomRelease; use strict; use warnings FATAL => qw(all); use Carp qw(confess); use Cwd qw(abs_path); use Exporter qw(import); our @EXPORT = qw(); use File::Basename qw(dirname); use pgBackRestDoc::Common::DocRender; use pgBackRestDoc::Common::Log; use pgBackRestDoc::Common::String; use pgBackRestDoc::Custom::DocConfigData; use pgBackRestDoc::ProjectInfo; #################################################################################################################################### # XML node constants #################################################################################################################################### use constant XML_PARAM_ID => 'id'; use constant XML_CONTRIBUTOR_LIST => 'contributor-list'; use constant XML_CONTRIBUTOR => 'contributor'; use constant XML_CONTRIBUTOR_NAME_DISPLAY => 'contributor-name-display'; use constant XML_RELEASE_CORE_LIST => 'release-core-list'; use constant XML_RELEASE_DOC_LIST => 'release-doc-list'; use constant XML_RELEASE_TEST_LIST => 'release-test-list'; use constant XML_RELEASE_BUG_LIST => 'release-bug-list'; use constant XML_RELEASE_DEVELOPMENT_LIST => 'release-development-list'; use constant XML_RELEASE_FEATURE_LIST => 'release-feature-list'; use constant XML_RELEASE_IMPROVEMENT_LIST => 'release-improvement-list'; use constant XML_RELEASE_ITEM_CONTRIBUTOR_LIST => 'release-item-contributor-list'; use constant XML_RELEASE_ITEM_CONTRIBUTOR => 'release-item-contributor'; use constant XML_RELEASE_ITEM_IDEATOR => 'release-item-ideator'; use constant XML_RELEASE_ITEM_REVIEWER => 'release-item-reviewer'; #################################################################################################################################### # Contributor text constants #################################################################################################################################### use constant TEXT_CONTRIBUTED => 'Contributed'; use constant TEXT_FIXED => 'Fixed'; use 
constant TEXT_FOUND => 'Reported'; use constant TEXT_REVIEWED => 'Reviewed'; use constant TEXT_SUGGESTED => 'Suggested'; #################################################################################################################################### # CONSTRUCTOR #################################################################################################################################### sub new { my $class = shift; # Class name # Create the class hash my $self = {}; bless $self, $class; # Assign function parameters, defaults, and log debug info ( my $strOperation, $self->{oDoc}, $self->{bDev}, ) = logDebugParam ( __PACKAGE__ . '->new', \@_, {name => 'oDoc'}, {name => 'bDev', required => false, default => false}, ); # Get contributor list foreach my $oContributor ($self->{oDoc}->nodeGet(XML_CONTRIBUTOR_LIST)->nodeList(XML_CONTRIBUTOR)) { my $strContributorId = $oContributor->paramGet(XML_PARAM_ID); if (!defined($self->{hContributor})) { $self->{hContributor} = {}; $self->{strContributorDefault} = $strContributorId; } ${$self->{hContributor}}{$strContributorId}{name} = $oContributor->fieldGet(XML_CONTRIBUTOR_NAME_DISPLAY); } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'self', value => $self} ); } #################################################################################################################################### # currentStableVersion # # Return the current stable version. 
#################################################################################################################################### sub currentStableVersion { my $self = shift; my $oDoc = $self->{oDoc}; foreach my $oRelease ($oDoc->nodeGet('release-list')->nodeList('release')) { my $strVersion = $oRelease->paramGet('version'); if ($strVersion !~ /dev$/) { return $strVersion; } } confess &log(ERROR, "unable to find non-development version"); } #################################################################################################################################### # releaseLast # # Get the last release. #################################################################################################################################### sub releaseLast { my $self = shift; my $oDoc = $self->{oDoc}; foreach my $oRelease ($oDoc->nodeGet('release-list')->nodeList('release')) { return $oRelease; } } #################################################################################################################################### # contributorTextGet # # Get a list of contributors for an item in text format. 
#################################################################################################################################### sub contributorTextGet { my $self = shift; my $oReleaseItem = shift; my $strItemType = shift; my $strContributorText; my $hItemContributorType = {}; # Create the list of contributors foreach my $strContributorType (XML_RELEASE_ITEM_IDEATOR, XML_RELEASE_ITEM_CONTRIBUTOR, XML_RELEASE_ITEM_REVIEWER) { my $stryItemContributor = []; if ($oReleaseItem->nodeTest(XML_RELEASE_ITEM_CONTRIBUTOR_LIST)) { foreach my $oContributor ($oReleaseItem->nodeGet(XML_RELEASE_ITEM_CONTRIBUTOR_LIST)-> nodeList($strContributorType, false)) { push @{$stryItemContributor}, $oContributor->paramGet(XML_PARAM_ID); } } if (@$stryItemContributor == 0 && $strContributorType eq XML_RELEASE_ITEM_CONTRIBUTOR) { push @{$stryItemContributor}, $self->{strContributorDefault} } # Add the default user as a reviewer if there are no reviewers listed and default user is not already a contributor if (@$stryItemContributor == 0 && $strContributorType eq XML_RELEASE_ITEM_REVIEWER) { my $bFound = false; foreach my $strContributor (@{$$hItemContributorType{&XML_RELEASE_ITEM_CONTRIBUTOR}}) { if ($strContributor eq $self->{strContributorDefault}) { $bFound = true; last; } } if (!$bFound) { push @{$stryItemContributor}, $self->{strContributorDefault} } } $$hItemContributorType{$strContributorType} = $stryItemContributor; } # Error if a reviewer is also a contributor foreach my $strReviewer (@{$$hItemContributorType{&XML_RELEASE_ITEM_REVIEWER}}) { foreach my $strContributor (@{$$hItemContributorType{&XML_RELEASE_ITEM_CONTRIBUTOR}}) { if ($strReviewer eq $strContributor) { confess &log(ERROR, "${strReviewer} cannot be both a contributor and a reviewer"); } } } # Error if the ideator list is the same as the contributor list if (join(',', @{$$hItemContributorType{&XML_RELEASE_ITEM_IDEATOR}}) eq join(',', @{$$hItemContributorType{&XML_RELEASE_ITEM_CONTRIBUTOR}})) { confess &log(ERROR,
'cannot have same contributor and ideator list: ' . join(', ', @{$$hItemContributorType{&XML_RELEASE_ITEM_CONTRIBUTOR}})); } # Remove the default user if they are the only one in a group (to prevent the entire page from being splattered with one name) foreach my $strContributorType (XML_RELEASE_ITEM_IDEATOR, XML_RELEASE_ITEM_CONTRIBUTOR) { if (@{$$hItemContributorType{$strContributorType}} == 1 && @{$$hItemContributorType{$strContributorType}}[0] eq $self->{strContributorDefault}) { $$hItemContributorType{$strContributorType} = []; } } # Render the string foreach my $strContributorType (XML_RELEASE_ITEM_CONTRIBUTOR, XML_RELEASE_ITEM_REVIEWER, XML_RELEASE_ITEM_IDEATOR) { my $stryItemContributor = $$hItemContributorType{$strContributorType}; my $strContributorTypeText; foreach my $strContributor (@{$stryItemContributor}) { my $hContributor = ${$self->{hContributor}}{$strContributor}; if (!defined($hContributor)) { confess &log(ERROR, "contributor ${strContributor} does not exist"); } $strContributorTypeText .= (defined($strContributorTypeText) ? ', ' : '') . $$hContributor{name}; } if (defined($strContributorTypeText)) { $strContributorTypeText = ' by ' . $strContributorTypeText . '.'; if ($strContributorType eq XML_RELEASE_ITEM_CONTRIBUTOR) { $strContributorTypeText = ($strItemType eq 'bug' ? TEXT_FIXED : TEXT_CONTRIBUTED) . $strContributorTypeText; } elsif ($strContributorType eq XML_RELEASE_ITEM_IDEATOR) { $strContributorTypeText = ($strItemType eq 'bug' ? TEXT_FOUND : TEXT_SUGGESTED) . $strContributorTypeText; } elsif ($strContributorType eq XML_RELEASE_ITEM_REVIEWER) { $strContributorTypeText = TEXT_REVIEWED . $strContributorTypeText; } $strContributorText .= (defined($strContributorText) ? ' ' : '') . $strContributorTypeText; } } return $strContributorText; } #################################################################################################################################### # Find a commit by subject prefix. 
Error if the prefix appears more than once. #################################################################################################################################### sub commitFindSubject { my $self = shift; my $rhyCommit = shift; my $strSubjectPrefix = shift; my $bRegExp = shift; $bRegExp = defined($bRegExp) ? $bRegExp : true; my $rhResult = undef; foreach my $rhCommit (@{$rhyCommit}) { if (($bRegExp && $rhCommit->{subject} =~ /^$strSubjectPrefix/) || (!$bRegExp && length($rhCommit->{subject}) >= length($strSubjectPrefix) && substr($rhCommit->{subject}, 0, length($strSubjectPrefix)) eq $strSubjectPrefix)) { if (defined($rhResult)) { confess &log(ERROR, "subject prefix '${strSubjectPrefix}' already found in commit " . $rhCommit->{commit}); } $rhResult = $rhCommit; } } return $rhResult; } #################################################################################################################################### # Throw an error that includes a list of release commits #################################################################################################################################### sub commitError { my $self = shift; my $strMessage = shift; my $rstryCommitRemaining = shift; my $rhyCommit = shift; my $strList; foreach my $strCommit (@{$rstryCommitRemaining}) { $strList .= (defined($strList) ? "\n" : '') . substr($rhyCommit->{$strCommit}{date}, 0, length($rhyCommit->{$strCommit}{date}) - 15) . " $strCommit: " . $rhyCommit->{$strCommit}{subject}; } confess &log(ERROR, "${strMessage}:\n${strList}"); } #################################################################################################################################### # docGet # # Get the xml for release. #################################################################################################################################### sub docGet { my $self = shift; # Assign function parameters, defaults, and log debug info my $strOperation = logDebugParam(__PACKAGE__ . 
'->docGet'); # Load the git history my $oStorageDoc = new pgBackRestTest::Common::Storage( dirname(abs_path($0)), new pgBackRestTest::Common::StoragePosix({bFileSync => false, bPathSync => false})); my @hyGitLog = @{(JSON::PP->new()->allow_nonref())->decode(${$oStorageDoc->get("resource/git-history.cache")})}; # Get renderer my $oRender = new pgBackRestDoc::Common::DocRender('text'); $oRender->tagSet('backrest', PROJECT_NAME); # Create the doc my $oDoc = new pgBackRestDoc::Common::Doc(); $oDoc->paramSet('title', $self->{oDoc}->paramGet('title')); $oDoc->paramSet('toc-number', $self->{oDoc}->paramGet('toc-number')); # Set the description for use as a meta tag $oDoc->fieldSet('description', $self->{oDoc}->fieldGet('description')); # Add the introduction my $oIntroSectionDoc = $oDoc->nodeAdd('section', undef, {id => 'introduction'}); $oIntroSectionDoc->nodeAdd('title')->textSet('Introduction'); $oIntroSectionDoc->textSet($self->{oDoc}->nodeGet('intro')->textGet()); # Add each release section my $oSection; my $iDevReleaseTotal = 0; my $iCurrentReleaseTotal = 0; my $iStableReleaseTotal = 0; my $iUnsupportedReleaseTotal = 0; my @oyRelease = $self->{oDoc}->nodeGet('release-list')->nodeList('release'); for (my $iReleaseIdx = 0; $iReleaseIdx < @oyRelease; $iReleaseIdx++) { my $oRelease = $oyRelease[$iReleaseIdx]; # Get the release version and dev flag my $strVersion = $oRelease->paramGet('version'); my $bReleaseDev = $strVersion =~ /dev$/ ? true : false; # Get a list of commits that apply to this release my @rhyReleaseCommit; my $rhReleaseCommitRemaining; my @stryReleaseCommitRemaining; my $bReleaseCheckCommit = false; if ($strVersion ge '2.01') { # Should commits in the release be checked? $bReleaseCheckCommit = !$bReleaseDev ? true : false; # Get the begin commit my $rhReleaseCommitBegin = $self->commitFindSubject(\@hyGitLog, "Begin v${strVersion} development\\."); my $strReleaseCommitBegin = defined($rhReleaseCommitBegin) ? 
$rhReleaseCommitBegin->{commit} : undef; # Get the end commit of the last release my $strReleaseLastVersion = $oyRelease[$iReleaseIdx + 1]->paramGet('version'); my $rhReleaseLastCommitEnd = $self->commitFindSubject(\@hyGitLog, "v${strReleaseLastVersion}\\: .+"); if (!defined($rhReleaseLastCommitEnd)) { confess &log(ERROR, "release ${strReleaseLastVersion} must have an end commit"); } my $strReleaseLastCommitEnd = $rhReleaseLastCommitEnd->{commit}; # Get the end commit my $rhReleaseCommitEnd = $self->commitFindSubject(\@hyGitLog, "v${strVersion}\\: .+"); my $strReleaseCommitEnd = defined($rhReleaseCommitEnd) ? $rhReleaseCommitEnd->{commit} : undef; if ($bReleaseCheckCommit && !defined($rhReleaseCommitEnd) && $iReleaseIdx != 0) { confess &log(ERROR, "release ${strVersion} must have an end commit"); } # Make a list of commits for this release while ($hyGitLog[0]->{commit} ne $strReleaseLastCommitEnd) { # Don't add begin/end commits to the list since they are already accounted for if ((defined($strReleaseCommitEnd) && $hyGitLog[0]->{commit} eq $strReleaseCommitEnd) || (defined($strReleaseCommitBegin) && $hyGitLog[0]->{commit} eq $strReleaseCommitBegin)) { shift(@hyGitLog); } # Else add the commit to this release's list else { push(@stryReleaseCommitRemaining, $hyGitLog[0]->{commit}); push(@rhyReleaseCommit, $hyGitLog[0]); $rhReleaseCommitRemaining->{$hyGitLog[0]->{commit}}{date} = $hyGitLog[0]->{date}; $rhReleaseCommitRemaining->{$hyGitLog[0]->{commit}}{subject} = $hyGitLog[0]->{subject}; shift(@hyGitLog); } } # At least one commit is required for non-dev releases if ($bReleaseCheckCommit && @stryReleaseCommitRemaining == 0) { confess &log(ERROR, "no commits found for release ${strVersion}"); } } # Display versions in TOC?
my $bTOC = true; # Create a release section if ($bReleaseDev) { if ($iDevReleaseTotal > 1) { confess &log(ERROR, 'only one development release is allowed'); } $oSection = $oDoc->nodeAdd('section', undef, {id => 'development', if => "'{[dev]}' eq 'y'"}); $oSection->nodeAdd('title')->textSet("Development Notes"); $iDevReleaseTotal++; } elsif ($iCurrentReleaseTotal == 0) { $oSection = $oDoc->nodeAdd('section', undef, {id => 'current'}); $oSection->nodeAdd('title')->textSet("Current Stable Release"); $iCurrentReleaseTotal++; } elsif ($strVersion ge '1.00') { if ($iStableReleaseTotal == 0) { $oSection = $oDoc->nodeAdd('section', undef, {id => 'supported'}); $oSection->nodeAdd('title')->textSet("Stable Releases"); } $iStableReleaseTotal++; $bTOC = false; } else { if ($iUnsupportedReleaseTotal == 0) { $oSection = $oDoc->nodeAdd('section', undef, {id => 'unsupported'}); $oSection->nodeAdd('title')->textSet("Pre-Stable Releases"); } $iUnsupportedReleaseTotal++; $bTOC = false; } # Format the date my $strDate = $oRelease->paramGet('date'); my $strDateOut = ""; my @stryMonth = ('January', 'February', 'March', 'April', 'May', 'June', 'July', 'August', 'September', 'October', 'November', 'December'); if ($strDate =~ /^X/) { $strDateOut .= 'No Release Date Set'; } else { if ($strDate !~ /^(XXXX-XX-XX)|([0-9]{4}-[0-9]{2}-[0-9]{2})$/) { confess &log(ASSERT, "invalid date ${strDate} for release ${strVersion}"); } $strDateOut .= 'Released ' . $stryMonth[(substr($strDate, 5, 2) - 1)] . ' ' . (substr($strDate, 8, 2) + 0) . ', ' . substr($strDate, 0, 4); } # Add section and titles my $oReleaseSection = $oSection->nodeAdd('section', undef, {id => $strVersion, toc => !$bTOC ? 'n' : undef}); $oReleaseSection->paramSet(XML_SECTION_PARAM_ANCHOR, XML_SECTION_PARAM_ANCHOR_VALUE_NOINHERIT); $oReleaseSection->nodeAdd('title')->textSet( "v${strVersion} " . ($bReleaseDev ? '' : 'Release ') .
'Notes'); $oReleaseSection->nodeAdd('subtitle')->textSet($oRelease->paramGet('title')); $oReleaseSection->nodeAdd('subsubtitle')->textSet($strDateOut); # Add release sections my $bAdditionalNotes = false; my $bReleaseNote = false; my $hSectionType = { &XML_RELEASE_CORE_LIST => {title => 'Core', type => 'core'}, &XML_RELEASE_DOC_LIST => {title => 'Documentation', type => 'doc'}, &XML_RELEASE_TEST_LIST => {title => 'Test Suite', type => 'test'}, }; foreach my $strSectionType (XML_RELEASE_CORE_LIST, XML_RELEASE_DOC_LIST, XML_RELEASE_TEST_LIST) { if ($oRelease->nodeTest($strSectionType)) { # Add release item types my $hItemType = { &XML_RELEASE_BUG_LIST => {title => 'Bug Fixes', type => 'bug'}, &XML_RELEASE_FEATURE_LIST => {title => 'Features', type => 'feature'}, &XML_RELEASE_IMPROVEMENT_LIST => {title => 'Improvements', type => 'improvement'}, &XML_RELEASE_DEVELOPMENT_LIST => {title => 'Development', type => 'development'}, }; foreach my $strItemType ( XML_RELEASE_BUG_LIST, XML_RELEASE_FEATURE_LIST, XML_RELEASE_IMPROVEMENT_LIST, XML_RELEASE_DEVELOPMENT_LIST) { next if (!$self->{bDev} && $strItemType eq XML_RELEASE_DEVELOPMENT_LIST); if ($oRelease->nodeGet($strSectionType)->nodeTest($strItemType)) { if ($strSectionType ne XML_RELEASE_CORE_LIST && !$bAdditionalNotes) { $oReleaseSection->nodeAdd('subtitle')->textSet('Additional Notes'); $bAdditionalNotes = true; } # Add release note if present if (!$bReleaseNote && defined($oRelease->nodeGet($strSectionType)->textGet(false))) { $oReleaseSection->nodeAdd('p')->textSet($oRelease->nodeGet($strSectionType)->textGet()); $bReleaseNote = true; } my $strTypeText = ($strSectionType eq XML_RELEASE_CORE_LIST ? '' : $$hSectionType{$strSectionType}{title}) . ' ' . $$hItemType{$strItemType}{title} . 
':'; $oReleaseSection-> nodeAdd('p')->textSet( {name => 'text', children=> [{name => 'b', value => $strTypeText}]}); my $oList = $oReleaseSection->nodeAdd('list'); # Add release items foreach my $oReleaseFeature ($oRelease->nodeGet($strSectionType)-> nodeGet($strItemType)->nodeList('release-item')) { my @rhyReleaseItemP = $oReleaseFeature->nodeList('p'); my $oReleaseItemText = $rhyReleaseItemP[0]->textGet(); # Check release item commits if ($bReleaseCheckCommit && $strItemType ne XML_RELEASE_DEVELOPMENT_LIST) { my @oyCommit = $oReleaseFeature->nodeList('commit', false); # If no commits found then try to use the description as the commit subject if (@oyCommit == 0) { my $strSubject = $oRender->processText($oReleaseItemText); my $rhCommit = $self->commitFindSubject(\@rhyReleaseCommit, $strSubject, false); if (!defined($rhCommit)) { $self->commitError( "unable to find commit or no subject match for release ${strVersion} item" . " '${strSubject}'", \@stryReleaseCommitRemaining, $rhReleaseCommitRemaining); my $strCommit = $rhCommit->{commit}; @stryReleaseCommitRemaining = grep(!/$strCommit/, @stryReleaseCommitRemaining); } } # Check the rest of the commits to ensure they exist foreach my $oCommit (@oyCommit) { my $strSubject = $oCommit->paramGet('subject'); my $rhCommit = $self->commitFindSubject(\@rhyReleaseCommit, $strSubject, false); if (defined($rhCommit)) { my $strCommit = $rhCommit->{commit}; @stryReleaseCommitRemaining = grep(!/$strCommit/, @stryReleaseCommitRemaining); } else { $self->commitError( "unable to find release ${strVersion} commit subject '${strSubject}' in list", \@stryReleaseCommitRemaining, $rhReleaseCommitRemaining); } } } # Append the rest of the text if (@rhyReleaseItemP > 1) { shift(@rhyReleaseItemP); push(@{$oReleaseItemText->{oDoc}{children}}, ' '); foreach my $rhReleaseItemP (@rhyReleaseItemP) { push(@{$oReleaseItemText->{oDoc}{children}}, @{$rhReleaseItemP->textGet()->{oDoc}{children}}); } } # Append contributor info my $strContributorText 
= $self->contributorTextGet($oReleaseFeature, $$hItemType{$strItemType}{type}); if (defined($strContributorText)) { push(@{$oReleaseItemText->{oDoc}{children}}, ' ('); push(@{$oReleaseItemText->{oDoc}{children}}, {name => 'i', value => $strContributorText}); push(@{$oReleaseItemText->{oDoc}{children}}, ')'); } # Add the list item $oList->nodeAdd('list-item')->textSet($oReleaseItemText); } } } } } # Error if there are commits left over # if ($bReleaseCheckCommit && @stryReleaseCommitRemaining != 0) # { # $self->commitError( # "unassigned commits for release ${strVersion}", \@stryReleaseCommitRemaining, $rhReleaseCommitRemaining); # } } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'oDoc', value => $oDoc} ); } 1; pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Html/000077500000000000000000000000001416457663300221215ustar00rootroot00000000000000pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Html/DocHtmlBuilder.pm000066400000000000000000000246751416457663300253360ustar00rootroot00000000000000#################################################################################################################################### # DOC HTML BUILDER MODULE #################################################################################################################################### package pgBackRestDoc::Html::DocHtmlBuilder; use strict; use warnings FATAL => qw(all); use Carp qw(confess); use Exporter qw(import); our @EXPORT = qw(); use pgBackRestDoc::Common::Log; use pgBackRestDoc::Common::String; use pgBackRestDoc::Html::DocHtmlElement; #################################################################################################################################### # CONSTRUCTOR #################################################################################################################################### sub new { my $class = shift; # Class name # Create the class hash my $self = {}; bless $self, $class; 
$self->{strClass} = $class; # Assign function parameters, defaults, and log debug info ( my $strOperation, $self->{strName}, $self->{strTitle}, $self->{strFavicon}, $self->{strLogo}, $self->{strDescription}, $self->{bPretty}, $self->{bCompact}, $self->{strCss}, ) = logDebugParam ( __PACKAGE__ . '->new', \@_, {name => 'strName'}, {name => 'strTitle'}, {name => 'strFavicon', required => false}, {name => 'strLogo', required => false}, {name => 'strDescription', required => false}, {name => 'bPretty', default => false}, {name => 'bCompact', default => false}, {name => 'strCss', required => false}, ); $self->{oBody} = new pgBackRestDoc::Html::DocHtmlElement(HTML_BODY); # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'self', value => $self} ); } #################################################################################################################################### # indent # # Indent html #################################################################################################################################### sub indent { my $self = shift; my $iDepth = shift; return $self->{bPretty} ? (' ' x $iDepth) : ''; } #################################################################################################################################### # lf # # Add a linefeed. #################################################################################################################################### sub lf { my $self = shift; return $self->{bPretty} ? "\n" : ''; } #################################################################################################################################### # bodyGet # # Get the body element. 
#################################################################################################################################### sub bodyGet { my $self = shift; return $self->{oBody}; } #################################################################################################################################### # htmlRender # # Render each html element. #################################################################################################################################### sub htmlRender { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oElement, $iDepth ) = logDebugParam ( __PACKAGE__ . '->htmlRender', \@_, {name => 'oElement', trace => true}, {name => 'iDepth', trace => true} ); # If a pre tag add a linefeed before the tag unless the prior tag was also pre. This makes the output more diffable. my $strHtml = ""; if ($oElement->{strType} eq HTML_PRE && !$self->{bPretty}) { if (!$self->{bPrePrior}) { $strHtml .= "\n"; } $self->{bPrePrior} = true; } else { $self->{bPrePrior} = false; } # Build the tag $strHtml .= $self->indent($iDepth) . "<$oElement->{strType}" . (defined($oElement->{strClass}) ? " class=\"$oElement->{strClass}\"": '') . (defined($oElement->{strRef}) ? " href=\"$oElement->{strRef}\"": '') . (defined($oElement->{strId}) ? " id=\"$oElement->{strId}\"": '') . (defined($oElement->{strExtra}) ? " $oElement->{strExtra}": '') . '>'; if (defined($oElement->{strContent})) { if (!defined($oElement->{bPre}) || !$oElement->{bPre}) { $oElement->{strContent} = trim($oElement->{strContent}); # Add a linefeed before the content if not pre. This makes the output more diffable. $strHtml .= "\n"; } else { $oElement->{strContent} =~ s/\&/&amp\;/g; } $strHtml .= $oElement->{strContent}; # Add a linefeed after the content if not pre. This makes the output more diffable. if (!defined($oElement->{bPre}) || !$oElement->{bPre}) { $strHtml .= "\n" .
$self->indent($iDepth); } } else { if (!($oElement->{strType} eq HTML_A && @{$oElement->{oyElement}} == 0)) { $strHtml .= $self->lf(); } foreach my $oChildElement (@{$oElement->{oyElement}}) { $strHtml .= $self->htmlRender($oChildElement, $iDepth + 1); } if (!($oElement->{strType} eq HTML_A && @{$oElement->{oyElement}} == 0)) { $strHtml .= $self->indent($iDepth); } } $strHtml .= "</$oElement->{strType}>"; # If a pre tag add an lf after the tag. This makes the output more diffable. $strHtml .= $oElement->{strType} eq HTML_PRE ? "\n" : $self->lf(); # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strHtml', value => $strHtml, trace => true} ); } #################################################################################################################################### # escape # # Escape special characters for html. #################################################################################################################################### sub escape { my $self = shift; my $strBuffer = shift; $strBuffer =~ s/\&/\&amp\;/g; $strBuffer =~ s/\</\&lt\;/g; return $strBuffer; } #################################################################################################################################### # htmlGet # # Generate the HTML. #################################################################################################################################### sub htmlGet { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $bAnalytics ) = logDebugParam ( __PACKAGE__ . '->htmlGet', \@_, {name => 'bAnalytics', optional => true, default => false, trace => true}, ); # Build the header my $strHtml = $self->indent(0) . "<!doctype html>" . $self->lf() . $self->indent(0) . "<html>" . $self->lf() . $self->indent(0) . "<head>" . $self->lf() . $self->indent(1) . "<title>\n" . $self->indent(2) . $self->escape($self->{strTitle}) . "\n" . $self->indent(1) . '</title>' . $self->lf() . $self->indent(1) . "\n"; if (!$self->{bCompact}) { $strHtml .= # $self->indent(1) . "\n" . $self->indent(1) . '\n" . $self->indent(1) . '\n" . $self->indent(1) . "\n"; if (defined($self->{strFavicon})) { $strHtml .= $self->indent(1) . "<link rel=\"icon\" href=\"$self->{strFavicon}\" type=\"image/png\">\n"; } if (defined($self->{strLogo})) { $strHtml .= $self->indent(1) . "\n" . $self->indent(1) . "{strLogo}\">\n"; } if (defined($self->{strDescription})) { $strHtml .= $self->indent(1) . '\n" . $self->indent(1) .
'\n"; } } if (defined($self->{strCss})) { my $strCss = $self->{strCss}; if (!$self->{bPretty}) { $strCss =~ s/^\s+//mg; $strCss =~ s/\n//g; $strCss =~ s/\/\*.*?\*\///g; } $strHtml .= $self->indent(1) . "\n"; } else { $strHtml .= $self->indent(1) . "\n"; } if ($bAnalytics) { $strHtml .= $self->indent(1) . "\n" . $self->indent(1) . "\n"; } $strHtml .= $self->indent(0) . "</head>" . $self->lf() . $self->htmlRender($self->bodyGet(), 0); # Complete the html $strHtml .= $self->indent(0) . "</html>" . $self->lf(); # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strHtml', value => $strHtml, trace => true} ); } 1; pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Html/DocHtmlElement.pm #################################################################################################################################### # DOC HTML ELEMENT MODULE #################################################################################################################################### package pgBackRestDoc::Html::DocHtmlElement; use strict; use warnings FATAL => qw(all); use Carp qw(confess); use Exporter qw(import); our @EXPORT = qw(); use Scalar::Util qw(blessed); use pgBackRestDoc::Common::Log; #################################################################################################################################### # Html Element Types #################################################################################################################################### use constant HTML_A => 'a'; push @EXPORT, qw(HTML_A); use constant HTML_BODY => 'body'; push @EXPORT, qw(HTML_BODY); use constant HTML_PRE => 'pre'; push @EXPORT, qw(HTML_PRE); use constant HTML_DIV => 'div'; push @EXPORT, qw(HTML_DIV); use constant HTML_SPAN => 'span'; push @EXPORT, qw(HTML_SPAN); use constant HTML_TABLE => 'table'; push @EXPORT, qw(HTML_TABLE); use constant HTML_TABLE_CAPTION =>
'caption'; push @EXPORT, qw(HTML_TABLE_CAPTION); use constant HTML_TD => 'td'; push @EXPORT, qw(HTML_TD); use constant HTML_TH => 'th'; push @EXPORT, qw(HTML_TH); use constant HTML_TR => 'tr'; push @EXPORT, qw(HTML_TR); use constant HTML_UL => 'ul'; push @EXPORT, qw(HTML_UL); use constant HTML_LI => 'li'; push @EXPORT, qw(HTML_LI); #################################################################################################################################### # CONSTRUCTOR #################################################################################################################################### sub new { my $class = shift; # Class name # Create the class hash my $self = {}; bless $self, $class; $self->{strClass} = $class; # Assign function parameters, defaults, and log debug info ( my $strOperation, $self->{strType}, $self->{strClass}, my $oParam ) = logDebugParam ( __PACKAGE__ . '->new', \@_, {name => 'strType', trace => true}, {name => 'strClass', required => false, trace => true}, {name => 'oParam', required => false, trace => true} ); $self->{oyElement} = []; $self->{strContent} = $$oParam{strContent}; $self->{strId} = $$oParam{strId}; $self->{strRef} = $$oParam{strRef}; $self->{strExtra} = $$oParam{strExtra}; $self->{bPre} = $$oParam{bPre}; # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'self', value => $self} ); } #################################################################################################################################### # addNew # # Create a new element and add it. #################################################################################################################################### sub addNew { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $strType, $strClass, $oParam ) = logDebugParam ( __PACKAGE__ . 
'->addNew', \@_, {name => 'strType', trace => true}, {name => 'strClass', required => false, trace => true}, {name => 'oParam', required => false, trace => true} ); my $oElement = new pgBackRestDoc::Html::DocHtmlElement($strType, $strClass, $oParam); $self->add($oElement); # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'oElement', value => $oElement, trace => true} ); } #################################################################################################################################### # add # # Add an element. #################################################################################################################################### sub add { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oElement ) = logDebugParam ( __PACKAGE__ . '->add', \@_, {name => 'oElement', trace => true} ); if (!(blessed($oElement) && $oElement->isa('pgBackRestDoc::Html::DocHtmlElement'))) { confess &log(ASSERT, 'oElement must be a valid element object'); } push(@{$self->{oyElement}}, $oElement); # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'oElement', value => $oElement, trace => true} ); } 1; pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Html/DocHtmlPage.pm000066400000000000000000000632001416457663300246070ustar00rootroot00000000000000#################################################################################################################################### # DOC HTML PAGE MODULE #################################################################################################################################### package pgBackRestDoc::Html::DocHtmlPage; use parent 'pgBackRestDoc::Common::DocExecute'; use strict; use warnings FATAL => qw(all); use Carp qw(confess); use Data::Dumper; use Exporter qw(import); our @EXPORT = qw(); use pgBackRestDoc::Common::DocConfig; use pgBackRestDoc::Common::DocManifest; 
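The `DocHtmlElement` class above builds a simple tree: `addNew()` constructs a child element and hands it to `add()`, which verifies the type via `blessed`/`isa` and pushes it onto the parent's `oyElement` array; `DocHtmlBuilder::htmlRender` later walks that array recursively to emit nested tags. A rough Python analogue of this build-then-render pattern (class and method names are illustrative only, not the module's API):

```python
class Element:
    def __init__(self, tag, content=None, cls=None):
        self.tag, self.content, self.cls = tag, content, cls
        self.children = []  # mirrors oyElement

    def add_new(self, tag, content=None, cls=None):
        # Create a child, attach it, and return it so calls can be chained
        child = Element(tag, content, cls)
        self.children.append(child)
        return child

    def render(self):
        # Leaf content wins; otherwise recurse into children
        attr = f' class="{self.cls}"' if self.cls else ""
        inner = self.content if self.content is not None else "".join(
            c.render() for c in self.children)
        return f"<{self.tag}{attr}>{inner}</{self.tag}>"
```

Returning the new child from `add_new` is what lets the Perl code chain calls such as `bodyGet()->addNew(...)->addNew(...)`.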
use pgBackRestDoc::Common::DocRender; use pgBackRestDoc::Html::DocHtmlBuilder; use pgBackRestDoc::Html::DocHtmlElement; use pgBackRestDoc::Common::Log; use pgBackRestDoc::Common::String; #################################################################################################################################### # CONSTRUCTOR #################################################################################################################################### sub new { my $class = shift; # Class name # Assign function parameters, defaults, and log debug info my ( $strOperation, $oManifest, $strRenderOutKey, $bMenu, $bExe, $bCompact, $strCss, $bPretty, ) = logDebugParam ( __PACKAGE__ . '->new', \@_, {name => 'oManifest'}, {name => 'strRenderOutKey'}, {name => 'bMenu'}, {name => 'bExe'}, {name => 'bCompact'}, {name => 'strCss'}, {name => 'bPretty'}, ); # Create the class hash my $self = $class->SUPER::new(RENDER_TYPE_HTML, $oManifest, $strRenderOutKey, $bExe); bless $self, $class; $self->{bMenu} = $bMenu; $self->{bCompact} = $bCompact; $self->{strCss} = $strCss; $self->{bPretty} = $bPretty; # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'self', value => $self} ); } #################################################################################################################################### # process # # Generate the site html #################################################################################################################################### sub process { my $self = shift; # Assign function parameters, defaults, and log debug info my $strOperation = logDebugParam(__PACKAGE__ . 
'->process'); # Working variables my $oPage = $self->{oDoc}; my $oRender = $self->{oManifest}->renderGet(RENDER_TYPE_HTML); # Initialize page my $strTitle = $oPage->paramGet('title'); my $strSubTitle = $oPage->paramGet('subtitle', false); my $oHtmlBuilder = new pgBackRestDoc::Html::DocHtmlBuilder( $self->{oManifest}->variableReplace('{[project]}' . (defined($self->{oManifest}->variableGet('project-tagline')) ? ' - ' . $self->{oManifest}->variableGet('project-tagline') : '')), $self->{oManifest}->variableReplace($strTitle . (defined($strSubTitle) ? " - ${strSubTitle}" : '')), $self->{oManifest}->variableGet('project-favicon'), $self->{oManifest}->variableGet('project-logo'), $self->{oManifest}->variableReplace(trim($self->{oDoc}->fieldGet('description'))), $self->{bPretty}, $self->{bCompact}, $self->{bCompact} ? $self->{strCss} : undef); # Generate header my $oPageHeader = $oHtmlBuilder->bodyGet()->addNew(HTML_DIV, 'page-header'); # add the logo to the header if (defined($self->{oManifest}->variableGet('html-logo'))) { $oPageHeader-> addNew(HTML_DIV, 'page-header-logo', {strContent =>"{[html-logo]}"}); } $oPageHeader-> addNew(HTML_DIV, 'page-header-title', {strContent => $strTitle}); if (defined($strSubTitle)) { $oPageHeader-> addNew(HTML_DIV, 'page-header-subtitle', {strContent => $strSubTitle}); } # Generate menu if ($self->{bMenu}) { my $oMenuBody = $oHtmlBuilder->bodyGet()->addNew(HTML_DIV, 'page-menu')->addNew(HTML_DIV, 'menu-body'); # Get the menu in the order listed in the manifest.xml foreach my $strRenderOutKey (@{${$oRender}{stryOrder}}) { # Do not output the menu item for the page the user is on (e.g. 
on Command page, the Command menu item will not appear) if ($strRenderOutKey ne $self->{strRenderOutKey}) { my $oRenderOut = $self->{oManifest}->renderOutGet(RENDER_TYPE_HTML, $strRenderOutKey); if (defined($$oRenderOut{menu})) { $oMenuBody->addNew(HTML_DIV, 'menu')->addNew( HTML_A, 'menu-link', {strContent => $$oRenderOut{menu}, strRef => $strRenderOutKey eq 'index' ? '{[project-url-root]}' : "${strRenderOutKey}.html"}); } } } } # Generate table of contents my $oPageTocBody; if ($self->{bToc}) { my $oPageToc = $oHtmlBuilder->bodyGet()->addNew(HTML_DIV, 'page-toc'); $oPageToc->addNew(HTML_DIV, 'page-toc-header')->addNew(HTML_DIV, 'page-toc-title', {strContent => "Table of Contents"}); $oPageTocBody = $oPageToc-> addNew(HTML_DIV, 'page-toc-body'); } # Generate body my $oPageBody = $oHtmlBuilder->bodyGet()->addNew(HTML_DIV, 'page-body'); my $iSectionNo = 1; # Render sections foreach my $oSection ($oPage->nodeList('section')) { my ($oChildSectionElement, $oChildSectionTocElement) = $self->sectionProcess($oSection, undef, "${iSectionNo}", 1); $oPageBody->add($oChildSectionElement); if (defined($oPageTocBody) && defined($oChildSectionTocElement)) { $oPageTocBody->add($oChildSectionTocElement); } $iSectionNo++; } my $oPageFooter = $oHtmlBuilder->bodyGet()-> addNew(HTML_DIV, 'page-footer', {strContent => '{[html-footer]}'}); # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strHtml', value => $oHtmlBuilder->htmlGet( {bAnalytics => defined($self->{oManifest}->variableGet('analytics')) && $self->{oManifest}->variableGet('analytics') eq 'y'}), trace => true} ); } #################################################################################################################################### # sectionProcess #################################################################################################################################### sub sectionProcess { my $self = shift; # Assign function parameters, defaults, and 
log debug info my ( $strOperation, $oSection, $strAnchor, $strSectionNo, $iDepth ) = logDebugParam ( __PACKAGE__ . '->sectionProcess', \@_, {name => 'oSection'}, {name => 'strAnchor', required => false}, {name => 'strSectionNo'}, {name => 'iDepth'} ); if ($oSection->paramGet('log')) { &log(INFO, (' ' x ($iDepth + 1)) . 'process section: ' . $oSection->paramGet('path')); } if ($iDepth > 3) { confess &log(ASSERT, "section depth of ${iDepth} exceeds maximum"); } # Working variables $strAnchor = ($oSection->paramTest(XML_SECTION_PARAM_ANCHOR, XML_SECTION_PARAM_ANCHOR_VALUE_NOINHERIT) ? '' : (defined($strAnchor) ? "${strAnchor}/" : '')) . $oSection->paramGet('id'); # Create the section toc element my $oSectionTocElement = new pgBackRestDoc::Html::DocHtmlElement(HTML_DIV, "section${iDepth}-toc"); # Create the section element my $oSectionElement = new pgBackRestDoc::Html::DocHtmlElement(HTML_DIV, "section${iDepth}"); # Add the section anchor $oSectionElement->addNew(HTML_A, undef, {strId => $strAnchor}); # Add the section title to section and toc my $oSectionHeaderElement = $oSectionElement->addNew(HTML_DIV, "section${iDepth}-header"); my $strSectionTitle = $self->processText($oSection->nodeGet('title')->textGet()); if ($self->{bTocNumber}) { $oSectionHeaderElement->addNew(HTML_DIV, "section${iDepth}-number", {strContent => $strSectionNo}); } $oSectionHeaderElement->addNew(HTML_DIV, "section${iDepth}-title", {strContent => $strSectionTitle}); if ($self->{bTocNumber}) { $oSectionTocElement->addNew(HTML_DIV, "section${iDepth}-toc-number", {strContent => $strSectionNo}); } my $oTocSectionTitleElement = $oSectionTocElement->addNew(HTML_DIV, "section${iDepth}-toc-title"); $oTocSectionTitleElement->addNew( HTML_A, undef, {strContent => $strSectionTitle, strRef => "#${strAnchor}"}); # Add the section intro if it exists if (defined($oSection->textGet(false))) { $oSectionElement-> addNew(HTML_DIV, "section-intro", {strContent => $self->processText($oSection->textGet())}); } # Add 
the section body my $oSectionBodyElement = $oSectionElement->addNew(HTML_DIV, "section-body"); # Process each child my $iSectionNo = 1; foreach my $oChild ($oSection->nodeList()) { &log(DEBUG, (' ' x ($iDepth + 2)) . 'process child ' . $oChild->nameGet()); # Execute a command if ($oChild->nameGet() eq 'execute-list') { my $bShow = $oChild->paramTest('show', 'n') ? false : true; my $oExecuteBodyElement; my $bFirst = true; my $strHostName = $self->{oManifest}->variableReplace($oChild->paramGet('host')); if ($bShow) { my $oSectionBodyExecute = $oSectionBodyElement->addNew(HTML_DIV, "execute"); $oSectionBodyExecute-> addNew(HTML_DIV, "execute-title", {strContent => "${strHostName} " . $self->processText($oChild->nodeGet('title')->textGet())}); $oExecuteBodyElement = $oSectionBodyExecute->addNew(HTML_DIV, "execute-body"); } foreach my $oExecute ($oChild->nodeList('execute')) { my $bExeShow = !$oExecute->paramTest('show', 'n'); my $bExeExpectedError = defined($oExecute->paramGet('err-expect', false)); my ($strCommand, $strOutput) = $self->execute( $oSection, $strHostName, $oExecute, {iIndent => $iDepth + 3, bShow => $bShow && $bExeShow}); if ($bShow && $bExeShow) { # Add continuation chars and proper spacing $strCommand =~ s/\n/\n /smg; $oExecuteBodyElement-> addNew(HTML_PRE, "execute-body-cmd", {strContent => $strCommand, bPre => true}); my $strHighLight = $self->{oManifest}->variableReplace($oExecute->fieldGet('exe-highlight', false)); my $bHighLightFound = false; if (defined($strOutput)) { my $bHighLightOld; my $strHighLightOutput; if ($oExecute->fieldTest('exe-highlight-type', 'error')) { $bExeExpectedError = true; } foreach my $strLine (split("\n", $strOutput)) { my $bHighLight = defined($strHighLight) && $strLine =~ /$strHighLight/; if (defined($bHighLightOld) && $bHighLight != $bHighLightOld) { $oExecuteBodyElement-> addNew(HTML_PRE, 'execute-body-output' . ($bHighLightOld ? '-highlight' . ($bExeExpectedError ? 
'-error' : '') : ''), {strContent => $strHighLightOutput, bPre => true}); undef($strHighLightOutput); } $strHighLightOutput .= (defined($strHighLightOutput) ? "\n" : '') . $strLine; $bHighLightOld = $bHighLight; $bHighLightFound = $bHighLightFound ? true : $bHighLight ? true : false; } if (defined($bHighLightOld)) { $oExecuteBodyElement-> addNew(HTML_PRE, 'execute-body-output' . ($bHighLightOld ? '-highlight' . ($bExeExpectedError ? '-error' : '') : ''), {strContent => $strHighLightOutput, bPre => true}); } $bFirst = true; } if ($self->{bExe} && $self->isRequired($oSection) && defined($strHighLight) && !$bHighLightFound) { confess &log(ERROR, "unable to find a match for highlight: ${strHighLight}"); } } $bFirst = false; } } # Add code block elsif ($oChild->nameGet() eq 'code-block') { my $strValue = $oChild->valueGet(); # Trim linefeeds from the beginning and all whitespace from the end $strValue =~ s/^\n+|\s+$//g; # Find the line with the fewest leading spaces my $iSpaceMin = undef; foreach my $strLine (split("\n", $strValue)) { $strLine =~ s/\s+$//; my $iSpaceMinTemp = length($strLine) - length(trim($strLine)); if (!defined($iSpaceMin) || $iSpaceMinTemp < $iSpaceMin) { $iSpaceMin = $iSpaceMinTemp; } } # Replace the leading spaces $strValue =~ s/^( ){$iSpaceMin}//smg; $oSectionBodyElement->addNew( HTML_PRE, 'code-block', {strContent => $strValue, bPre => true}); } # Add table elsif ($oChild->nameGet() eq 'table') { my $oTableTitle; if ($oChild->nodeTest('title')) { $oTableTitle = $oChild->nodeGet('title'); } my $oTableElement = $oSectionBodyElement->addNew(HTML_TABLE, 'table'); my @oyColumn; # If there is a title element then add it as the caption for the table if (defined($oTableTitle)) { # Print the label (e.g. Table 1:) in front of the title if one exists my $strTableTitle = $oTableTitle->paramTest('label') ? ($oTableTitle->paramGet('label') . ': '. 
$self->processText($oTableTitle->textGet())) : $self->processText($oTableTitle->textGet()); $oTableElement->addNew(HTML_TABLE_CAPTION, 'table-caption', {strContent => $strTableTitle}); } # Build the header if ($oChild->nodeTest('table-header')) { my $oHeader = $oChild->nodeGet('table-header'); @oyColumn = $oHeader->nodeList('table-column'); my $oHeaderRowElement = $oTableElement->addNew(HTML_TR, 'table-header-row'); foreach my $oColumn (@oyColumn) { # Each column can have different alignment properties - if not set, then default to align left my $strAlign = $oColumn->paramGet("align", false, 'left'); my $bFill = $oColumn->paramTest('fill', 'y'); $oHeaderRowElement->addNew( HTML_TH, "table-header-${strAlign}" . ($bFill ? " table-header-fill" : ""), {strContent => $self->processText($oColumn->textGet())}); } } # Build the rows foreach my $oRow ($oChild->nodeGet('table-data')->nodeList('table-row')) { my $oRowElement = $oTableElement->addNew(HTML_TR, 'table-row'); my @oRowCellList = $oRow->nodeList('table-cell'); for (my $iRowCellIdx = 0; $iRowCellIdx < @oRowCellList; $iRowCellIdx++) { my $oRowCell = $oRowCellList[$iRowCellIdx]; # If a header row was defined, then get the column alignment, else default to left my $strAlign = @oyColumn > 0 ? 
$oyColumn[$iRowCellIdx]->paramGet("align", false, 'left') : 'left'; $oRowElement->addNew( HTML_TD, "table-data-${strAlign}", {strContent => $self->processText($oRowCell->textGet())}); } } } # Add descriptive text elsif ($oChild->nameGet() eq 'p') { $oSectionBodyElement-> addNew(HTML_DIV, 'section-body-text', {strContent => $self->processText($oChild->textGet())}); } # Add option descriptive text elsif ($oChild->nameGet() eq 'option-description') { my $strOption = $oChild->paramGet("key"); my $oDescription = ${$self->{oReference}->{oConfigHash}}{&CONFIG_HELP_OPTION}{$strOption}{&CONFIG_HELP_DESCRIPTION}; if (!defined($oDescription)) { confess &log(ERROR, "unable to find ${strOption} option in sections - try adding option?"); } $oSectionBodyElement-> addNew(HTML_DIV, 'section-body-text', {strContent => $self->processText($oDescription)}); } # Add cmd descriptive text elsif ($oChild->nameGet() eq 'cmd-description') { my $strCommand = $oChild->paramGet("key"); my $oDescription = ${$self->{oReference}->{oConfigHash}}{&CONFIG_HELP_COMMAND}{$strCommand}{&CONFIG_HELP_DESCRIPTION}; if (!defined($oDescription)) { confess &log(ERROR, "unable to find ${strCommand} command in sections - try adding command?"); } $oSectionBodyElement-> addNew(HTML_DIV, 'section-body-text', {strContent => $self->processText($oDescription)}); } # Add/remove backrest config options elsif ($oChild->nameGet() eq 'backrest-config') { my $oConfigElement = $self->backrestConfigProcess($oSection, $oChild, $iDepth + 3); if (defined($oConfigElement)) { $oSectionBodyElement->add($oConfigElement); } } # Add/remove postgres config options elsif ($oChild->nameGet() eq 'postgres-config') { my $oConfigElement = $self->postgresConfigProcess($oSection, $oChild, $iDepth + 3); if (defined($oConfigElement)) { $oSectionBodyElement->add($oConfigElement); } } # Add a list elsif ($oChild->nameGet() eq 'list') { my $oList = $oSectionBodyElement->addNew(HTML_UL, 'list-unordered'); foreach my $oListItem ($oChild->nodeList()) 
{ $oList->addNew(HTML_LI, 'list-unordered', {strContent => $self->processText($oListItem->textGet())}); } } # Add a subtitle elsif ($oChild->nameGet() eq 'subtitle') { $oSectionBodyElement-> addNew(HTML_DIV, "section${iDepth}-subtitle", {strContent => $self->processText($oChild->textGet())}); } # Add a subsubtitle elsif ($oChild->nameGet() eq 'subsubtitle') { $oSectionBodyElement-> addNew(HTML_DIV, "section${iDepth}-subsubtitle", {strContent => $self->processText($oChild->textGet())}); } # Add a subsection elsif ($oChild->nameGet() eq 'section') { my ($oChildSectionElement, $oChildSectionTocElement) = $self->sectionProcess($oChild, $strAnchor, "${strSectionNo}.${iSectionNo}", $iDepth + 1); $oSectionBodyElement->add($oChildSectionElement); if (defined($oChildSectionTocElement)) { $oSectionTocElement->add($oChildSectionTocElement); } $iSectionNo++; } # Add an admonition (e.g. NOTE, WARNING, etc) elsif ($oChild->nameGet() eq 'admonition') { my $oAdmonition = $oSectionBodyElement->addNew(HTML_DIV, 'admonition'); $oAdmonition->addNew(HTML_DIV, $oChild->paramGet('type'), {strContent => uc($oChild->paramGet('type')) . ": "}); $oAdmonition->addNew(HTML_DIV, $oChild->paramGet('type') . '-text', {strContent => $self->processText($oChild->textGet())}); } # Check if the child can be processed by a parent else { $self->sectionChildProcess($oSection, $oChild, $iDepth + 1); } } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'oSectionElement', value => $oSectionElement, trace => true}, {name => 'oSectionTocElement', value => $oSection->paramTest('toc', 'n') ? 
undef : $oSectionTocElement, trace => true} ); } #################################################################################################################################### # backrestConfigProcess #################################################################################################################################### sub backrestConfigProcess { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oSection, $oConfig, $iDepth ) = logDebugParam ( __PACKAGE__ . '->backrestConfigProcess', \@_, {name => 'oSection'}, {name => 'oConfig'}, {name => 'iDepth'} ); # Generate the config my $oConfigElement; my ($strFile, $strConfig, $bShow) = $self->backrestConfig($oSection, $oConfig, $iDepth); if ($bShow) { my $strHostName = $self->{oManifest}->variableReplace($oConfig->paramGet('host')); # Render the config $oConfigElement = new pgBackRestDoc::Html::DocHtmlElement(HTML_DIV, "config"); $oConfigElement-> addNew(HTML_DIV, "config-title", {strContent => "${strHostName}:${strFile}" . " " . 
$self->processText($oConfig->nodeGet('title')->textGet())}); my $oConfigBodyElement = $oConfigElement->addNew(HTML_DIV, "config-body"); # # $oConfigBodyElement-> # addNew(HTML_DIV, "config-body-title", # {strContent => "${strFile}:"}); # Convert linefeeds to br tags $strConfig =~ s/\n/<br\/>\n/g; $oConfigBodyElement-> addNew(HTML_DIV, "config-body-output", {strContent => $strConfig}); } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'oConfigElement', value => $oConfigElement, trace => true} ); } #################################################################################################################################### # postgresConfigProcess #################################################################################################################################### sub postgresConfigProcess { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oSection, $oConfig, $iDepth ) = logDebugParam ( __PACKAGE__ . '->postgresConfigProcess', \@_, {name => 'oSection'}, {name => 'oConfig'}, {name => 'iDepth'} ); # Generate the config my $oConfigElement; my ($strFile, $strConfig, $bShow) = $self->postgresConfig($oSection, $oConfig, $iDepth); if ($bShow) { # Render the config my $strHostName = $self->{oManifest}->variableReplace($oConfig->paramGet('host')); $oConfigElement = new pgBackRestDoc::Html::DocHtmlElement(HTML_DIV, "config"); $oConfigElement-> addNew(HTML_DIV, "config-title", {strContent => "${strHostName}:${strFile}" . " " . $self->processText($oConfig->nodeGet('title')->textGet())}); my $oConfigBodyElement = $oConfigElement->addNew(HTML_DIV, "config-body"); # $oConfigBodyElement-> # addNew(HTML_DIV, "config-body-title", # {strContent => "append to ${strFile}:"}); # Convert linefeeds to br tags $strConfig =~ s/<br\/>\n/g; $oConfigBodyElement-> addNew(HTML_DIV, "config-body-output", {strContent => defined($strConfig) ?
$strConfig : ''}); $oConfig->fieldSet('actual-config', $strConfig); } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'oConfigElement', value => $oConfigElement, trace => true} ); } 1; pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Html/DocHtmlSite.pm000066400000000000000000000130451416457663300246410ustar00rootroot00000000000000#################################################################################################################################### # DOC HTML SITE MODULE #################################################################################################################################### package pgBackRestDoc::Html::DocHtmlSite; use strict; use warnings FATAL => qw(all); use Carp qw(confess); use English '-no_match_vars'; use Data::Dumper; use Exporter qw(import); our @EXPORT = qw(); use File::Basename qw(dirname); use File::Copy; use POSIX qw(strftime); use Storable qw(dclone); use pgBackRestTest::Common::ExecuteTest; use pgBackRestDoc::Common::DocConfig; use pgBackRestDoc::Common::DocManifest; use pgBackRestDoc::Common::Exception; use pgBackRestDoc::Common::Log; use pgBackRestDoc::Common::String; use pgBackRestDoc::Html::DocHtmlPage; use pgBackRestDoc::ProjectInfo; #################################################################################################################################### # CONSTRUCTOR #################################################################################################################################### sub new { my $class = shift; # Class name # Create the class hash my $self = {}; bless $self, $class; $self->{strClass} = $class; # Assign function parameters, defaults, and log debug info ( my $strOperation, $self->{oManifest}, $self->{strXmlPath}, $self->{strHtmlPath}, $self->{strCssFile}, $self->{strFaviconFile}, $self->{strProjectLogoFile}, $self->{bExe} ) = logDebugParam ( __PACKAGE__ . 
'->new', \@_, {name => 'oManifest'}, {name => 'strXmlPath'}, {name => 'strHtmlPath'}, {name => 'strCssFile'}, {name => 'strFaviconFile', required => false}, {name => 'strProjectLogoFile', required => false}, {name => 'bExe'} ); # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'self', value => $self} ); } #################################################################################################################################### # process # # Generate the site html #################################################################################################################################### sub process { my $self = shift; # Assign function parameters, defaults, and log debug info my $strOperation = logDebugParam(__PACKAGE__ . '->process'); # Get render options my $oRender = $self->{oManifest}->renderGet(RENDER_TYPE_HTML); my $bMenu = $$oRender{&RENDER_MENU}; my $bPretty = $$oRender{&RENDER_PRETTY}; my $bCompact = $$oRender{&RENDER_COMPACT}; if (!$bCompact) { # Copy the css file my $strCssFileDestination = "$self->{strHtmlPath}/default.css"; copy($self->{strCssFile}, $strCssFileDestination) or confess &log(ERROR, "unable to copy $self->{strCssFile} to ${strCssFileDestination}"); # Copy the favicon file if (defined($self->{strFaviconFile})) { my $strFaviconFileDestination = "$self->{strHtmlPath}/" . $self->{oManifest}->variableGet('project-favicon'); copy($self->{strFaviconFile}, $strFaviconFileDestination) or confess &log(ERROR, "unable to copy $self->{strFaviconFile} to ${strFaviconFileDestination}"); } # Copy the project logo file if (defined($self->{strProjectLogoFile})) { my $strProjectLogoFileDestination = "$self->{strHtmlPath}/" . 
$self->{oManifest}->variableGet('project-logo'); copy($self->{strProjectLogoFile}, $strProjectLogoFileDestination) or confess &log(ERROR, "unable to copy $self->{strProjectLogoFile} to ${strProjectLogoFileDestination}"); } } foreach my $strPageId ($self->{oManifest}->renderOutList(RENDER_TYPE_HTML)) { &log(INFO, " render out: ${strPageId}"); my $strHtml; my $oRenderOut = $self->{oManifest}->renderOutGet(RENDER_TYPE_HTML, $strPageId); eval { $strHtml = $self->{oManifest}->variableReplace( new pgBackRestDoc::Html::DocHtmlPage( $self->{oManifest}, $strPageId, $bMenu, $self->{bExe}, $bCompact, ${$self->{oManifest}->storage()->get($self->{strCssFile})}, $bPretty)->process()); return true; } or do { my $oException = $@; if (exceptionCode($oException) == ERROR_FILE_INVALID) { my $oRenderOut = $self->{oManifest}->renderOutGet(RENDER_TYPE_HTML, $strPageId); $self->{oManifest}->cacheReset($$oRenderOut{source}); $strHtml = $self->{oManifest}->variableReplace( new pgBackRestDoc::Html::DocHtmlPage( $self->{oManifest}, $strPageId, $bMenu, $self->{bExe}, $bCompact, ${$self->{oManifest}->storage()->get($self->{strCssFile})}, $bPretty)->process()); } else { confess $oException; } }; # Save the html page my $strFile = "$self->{strHtmlPath}/" . (defined($$oRenderOut{file}) ? 
$$oRenderOut{file} : "${strPageId}.html"); $self->{oManifest}->storage()->put($strFile, $strHtml); } # Return from function and log return values if any logDebugReturn($strOperation); } 1; pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Latex/000077500000000000000000000000001416457663300222725ustar00rootroot00000000000000pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Latex/DocLatex.pm000066400000000000000000000136751416457663300243470ustar00rootroot00000000000000#################################################################################################################################### # DOC LATEX MODULE #################################################################################################################################### package pgBackRestDoc::Latex::DocLatex; use strict; use warnings FATAL => qw(all); use Carp qw(confess); use English '-no_match_vars'; use Cwd qw(abs_path); use Data::Dumper; use Exporter qw(import); our @EXPORT = qw(); use File::Basename qw(dirname basename); use File::Copy; use POSIX qw(strftime); use Storable qw(dclone); use pgBackRestTest::Common::ExecuteTest; use pgBackRestDoc::Common::DocConfig; use pgBackRestDoc::Common::DocManifest; use pgBackRestDoc::Common::Exception; use pgBackRestDoc::Common::Log; use pgBackRestDoc::Common::String; use pgBackRestDoc::Latex::DocLatexSection; use pgBackRestDoc::ProjectInfo; #################################################################################################################################### # CONSTRUCTOR #################################################################################################################################### sub new { my $class = shift; # Class name # Create the class hash my $self = {}; bless $self, $class; $self->{strClass} = $class; # Assign function parameters, defaults, and log debug info ( my $strOperation, $self->{oManifest}, $self->{strXmlPath}, $self->{strLatexPath}, $self->{strPreambleFile}, $self->{bExe} ) = logDebugParam ( __PACKAGE__ 
. '->new', \@_, {name => 'oManifest'}, {name => 'strXmlPath'}, {name => 'strLatexPath'}, {name => 'strPreambleFile'}, {name => 'bExe'} ); # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'self', value => $self} ); } #################################################################################################################################### # process # # Generate the site html #################################################################################################################################### sub process { my $self = shift; # Assign function parameters, defaults, and log debug info my $strOperation = logDebugParam(__PACKAGE__ . '->process'); my $oRender = $self->{oManifest}->renderGet(RENDER_TYPE_PDF); my $strLogo = $self->{oManifest}->variableGet('pdf-resource-logo'); if (!defined($strLogo)) { $strLogo = 'blank.eps'; } my ($strExt) = $strLogo =~ /(\.[^.]+)$/; my $strLogoPath = defined($self->{oManifest}->variableGet('pdf-resource-path')) ? $self->{oManifest}->variableGet('pdf-resource-path') : "$self->{oManifest}{strDocPath}/resource/latex/"; # Copy the logo copy($strLogoPath . $strLogo, "$self->{strLatexPath}/logo$strExt") or confess &log(ERROR, "unable to copy logo"); my $strLatex = $self->{oManifest}->variableReplace( ${$self->{oManifest}->storage()->get($self->{strPreambleFile})}, 'latex') . "\n"; # ??? 
Temp hack for underscores in filename $strLatex =~ s/pgaudit\\\_doc/pgaudit\_doc/g; # Process the sources in the order listed in the manifest.xml foreach my $strPageId (@{${$self->{oManifest}->renderGet(RENDER_TYPE_PDF)}{stryOrder}}) { &log(INFO, " render out: ${strPageId}"); eval { my $oDocLatexSection = new pgBackRestDoc::Latex::DocLatexSection($self->{oManifest}, $strPageId, $self->{bExe}); # Save the html page $strLatex .= $oDocLatexSection->process(); return true; } or do { my $oException = $EVAL_ERROR; if (exceptionCode($oException) == ERROR_FILE_INVALID) { my $oRenderOut = $self->{oManifest}->renderOutGet(RENDER_TYPE_HTML, $strPageId); $self->{oManifest}->cacheReset($$oRenderOut{source}); my $oDocLatexSection = new pgBackRestDoc::Latex::DocLatexSection($self->{oManifest}, $strPageId, $self->{bExe}); # Save the html page $strLatex .= $oDocLatexSection->process(); } else { confess $oException; } }; } $strLatex .= "\n% " . ('-' x 130) . "\n% End document\n% " . ('-' x 130) . "\n\\end{document}\n"; # Get base name of output file to use for processing (my $strLatexFileBase = basename($$oRender{file})) =~ s/\.[^.]+$//; $strLatexFileBase = $self->{oManifest}->variableReplace($strLatexFileBase); # Name of latex file to use for output and processing my $strLatexFileName = $self->{oManifest}->variableReplace("$self->{strLatexPath}/" . $strLatexFileBase . '.tex'); # Output latex and build PDF $self->{oManifest}->storage()->put($strLatexFileName, $strLatex); executeTest("pdflatex -output-directory=$self->{strLatexPath} -shell-escape $strLatexFileName", {bSuppressStdErr => true}); executeTest("pdflatex -output-directory=$self->{strLatexPath} -shell-escape $strLatexFileName", {bSuppressStdErr => true}); # Determine path of output file my $strLatexOutputName = $oRender->{file}; if ($strLatexOutputName !~ /^\//) { $strLatexOutputName = abs_path($self->{strLatexPath} . "/" . 
$oRender->{file});
    }

    # Copy pdf file if it is not already in the correct place
    if ($strLatexOutputName ne "$self->{strLatexPath}/" . $strLatexFileBase . '.pdf')
    {
        copy("$self->{strLatexPath}/" . $strLatexFileBase . '.pdf', $strLatexOutputName)
            or confess &log(ERROR, "unable to copy pdf to " . $strLatexOutputName);
    }

    # Return from function and log return values if any
    logDebugReturn($strOperation);
}

1;

pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Latex/DocLatexSection.pm
####################################################################################################################################
# DOC LATEX SECTION MODULE
####################################################################################################################################
package pgBackRestDoc::Latex::DocLatexSection;
use parent 'pgBackRestDoc::Common::DocExecute';

use strict;
use warnings FATAL => qw(all);
use Carp qw(confess);

use Exporter qw(import);
    our @EXPORT = qw();

use pgBackRestDoc::Common::DocConfig;
use pgBackRestDoc::Common::DocManifest;
use pgBackRestDoc::Common::Log;
use pgBackRestDoc::Common::String;

####################################################################################################################################
# CONSTRUCTOR
####################################################################################################################################
sub new
{
    my $class = shift;       # Class name

    # Assign function parameters, defaults, and log debug info
    my
    (
        $strOperation,
        $oManifest,
        $strRenderOutKey,
        $bExe
    ) =
        logDebugParam
        (
            __PACKAGE__ .
'->new', \@_, {name => 'oManifest'}, {name => 'strRenderOutKey'}, {name => 'bExe'} ); # Create the class hash my $self = $class->SUPER::new('latex', $oManifest, $strRenderOutKey, $bExe); bless $self, $class; # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'self', value => $self} ); } #################################################################################################################################### # process # # Generate the site html #################################################################################################################################### sub process { my $self = shift; # Assign function parameters, defaults, and log debug info my $strOperation = logDebugParam(__PACKAGE__ . '->process'); # Working variables my $oPage = $self->{oDoc}; my $strLatex; # Initialize page my $strTitle = $oPage->paramGet('title'); my $strSubTitle = $oPage->paramGet('subtitle', false); # Render sections foreach my $oSection ($oPage->nodeList('section')) { $strLatex .= (defined($strLatex) ? "\n" : '') . $self->sectionProcess($oSection, undef, 1); } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strHtml', value => $strLatex, trace => true} ); } #################################################################################################################################### # sectionProcess #################################################################################################################################### sub sectionProcess { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oSection, $strSection, $iDepth ) = logDebugParam ( __PACKAGE__ . '->sectionRender', \@_, {name => 'oSection'}, {name => 'strSection', required => false}, {name => 'iDepth'} ); if ($oSection->paramGet('log')) { &log(INFO, (' ' x ($iDepth + 1)) . 'process section: ' . 
$oSection->paramGet('path')); } # Create section type my $strSectionTitle = $self->processText($oSection->nodeGet('title')->textGet()); $strSection .= (defined($strSection) ? ', ' : '') . "'${strSectionTitle}' " . ('Sub' x ($iDepth - 1)) . "Section"; # Create section comment my $strLatex = "% ${strSection}\n% " . ('-' x 130) . "\n"; # Exclude from table of contents if requested if ($iDepth <= 3 && $oSection->paramTest('toc', 'n')) { $strLatex .= '\\addtocontents{toc}{\\protect\\setcounter{tocdepth}{' . ($iDepth - 1) . "}}\n"; } # Create section name $strLatex .= '\\'; if ($iDepth <= 3) { $strLatex .= ($iDepth > 1 ? ('sub' x ($iDepth - 1)) : '') . "section"; } elsif ($iDepth == 4) { $strLatex .= 'paragraph'; } else { confess &log(ASSERT, "section depth of ${iDepth} exceeds maximum"); } $strLatex .= "\{${strSectionTitle}\}\\label{" . $oSection->paramGet('path', false) . "}\n"; # Reset table of contents numbering if the section was excluded if ($iDepth <= 3 && $oSection->paramTest('toc', 'n')) { $strLatex .= '\\addtocontents{toc}{\\protect\\setcounter{tocdepth}{' . $iDepth . "}}\n"; } foreach my $oChild ($oSection->nodeList()) { &log(DEBUG, (' ' x ($iDepth + 2)) . 'process child ' . $oChild->nameGet()); # Execute a command if ($oChild->nameGet() eq 'execute-list') { my $bShow = $oChild->paramTest('show', 'n') ? false : true; my $strHostName = $self->{oManifest}->variableReplace($oChild->paramGet('host')); if ($bShow) { $strLatex .= "\n\\begin\{lstlisting\}[title=\{\\textnormal{\\textbf\{${strHostName}}} --- " . $self->processText($oChild->nodeGet('title')->textGet()) . 
"}]\n"; } foreach my $oExecute ($oChild->nodeList('execute')) { my $bExeShow = !$oExecute->paramTest('show', 'n'); my ($strCommand, $strOutput) = $self->execute( $oSection, $self->{oManifest}->variableReplace($oChild->paramGet('host')), $oExecute, {iIndent => $iDepth + 3, bShow => $bShow && $bExeShow}); if ($bShow && $bExeShow) { $strLatex .= "${strCommand}\n"; if (defined($strOutput)) { $strLatex .= "\nOutput:\n\n${strOutput}\n"; } } } if ($bShow) { $strLatex .= "\\end{lstlisting}\n"; } } # Add code block elsif ($oChild->nameGet() eq 'code-block') { my $strTitle = $self->{oManifest}->variableReplace($oChild->paramGet("title", false), 'latex'); if (defined($strTitle) && $strTitle eq '') { undef($strTitle) } # Begin the code listing if (!defined($strTitle)) { $strLatex .= "\\vspace{.75em}\n"; } $strLatex .= "\\begin\{lstlisting\}"; # Add the title if one is provided if (defined($strTitle)) { $strLatex .= "[title=\{${strTitle}:\}]"; } # End the code listing $strLatex .= "\n" . trim($oChild->valueGet()) . "\n" . "\\end{lstlisting}\n"; } # Add table elsif ($oChild->nameGet() eq 'table') { my $oHeader; my @oyColumn; if ($oChild->nodeTest('table-header')) { $oHeader = $oChild->nodeGet('table-header'); @oyColumn = $oHeader->nodeList('table-column'); } my $strWidth = '{' . (defined($oHeader) && $oHeader->paramTest('width') ? ($oHeader->paramGet('width') / 100) . '\textwidth' : '\textwidth') . '}'; # Build the table $strLatex .= "\\vspace{1em}\\newline\n\\begin{table}\n\\begin{tabularx}${strWidth}{|"; # Build the table header foreach my $oColumn (@oyColumn) { my $strAlignCode; my $strAlign = $oColumn->paramGet("align", false); # If fill is specified then use X or the custom designed alignments in the preamble to fill and justify the columns. 
if ($oColumn->paramTest('fill') && $oColumn->paramGet('fill', false) eq 'y') { if (!defined($strAlign) || $strAlign eq 'left') { $strAlignCode = 'X'; } elsif ($strAlign eq 'right') { $strAlignCode = 'R'; } elsif ($strAlign eq 'center') { $strAlignCode = 'C'; } else { confess &log(ERROR, "align '${strAlign}' not valid when fill=y"); } } else { if (!defined($strAlign) || $strAlign eq 'left') { $strAlignCode = 'l'; } elsif ($strAlign eq 'center') { $strAlignCode = 'c'; } elsif ($strAlign eq 'right') { $strAlignCode = 'r'; } else { confess &log(ERROR, "align '${strAlign}' not valid"); } } # $strLatex .= 'p{' . $oColumn->paramGet("width") . '} | '; $strLatex .= $strAlignCode . ' | '; } # If table-header not provided then default the column alignment and fill by using the number of columns in the 1st row if (!defined($oHeader)) { my @oyRow = $oChild->nodeGet('table-data')->nodeList('table-row'); foreach my $oRowCell ($oyRow[0]->nodeList('table-cell')) { $strLatex .= 'X|'; } } $strLatex .= "}\n"; my $strLine; if (defined($oHeader)) { $strLatex .= "\\hline"; $strLatex .= "\\rowcolor{ltgray}\n"; foreach my $oColumn (@oyColumn) { $strLine .= (defined($strLine) ? ' & ' : '') . '\textbf{' . $self->processText($oColumn->textGet()) . '}'; } $strLatex .= "${strLine}\\\\"; } # Build the rows foreach my $oRow ($oChild->nodeGet('table-data')->nodeList('table-row')) { $strLatex .= "\\hline\n"; undef($strLine); foreach my $oRowCell ($oRow->nodeList('table-cell')) { $strLine .= (defined($strLine) ? ' & ' : '') . $self->processText($oRowCell->textGet()); } $strLatex .= "${strLine}\\\\"; } $strLatex .= "\\hline\n\\end{tabularx}\n"; # If there is a title for the table, add it. Ignore the label since LaTex will automatically generate numbered labels. # e.g. Table 1: if ($oChild->nodeGet("title", false)) { $strLatex .= "\\caption{" . $self->processText($oChild->nodeGet("title")->textGet()) . 
"}\n"; } $strLatex .= "\\end{table}\n"; } # Add descriptive text elsif ($oChild->nameGet() eq 'p') { $strLatex .= "\n\\begin{sloppypar}" . $self->processText($oChild->textGet()) . "\\end{sloppypar}\n"; } # Add option descriptive text elsif ($oChild->nameGet() eq 'option-description') { my $strOption = $oChild->paramGet("key"); my $oDescription = ${$self->{oReference}->{oConfigHash}}{&CONFIG_HELP_OPTION}{$strOption}{&CONFIG_HELP_DESCRIPTION}; if (!defined($oDescription)) { confess &log(ERROR, "unable to find ${strOption} option in sections - try adding option?"); } $strLatex .= "\n\\begin{sloppypar}" . $self->processText($oDescription) . "\\end{sloppypar}\n"; } # Add cmd descriptive text elsif ($oChild->nameGet() eq 'cmd-description') { my $strCommand = $oChild->paramGet("key"); my $oDescription = ${$self->{oReference}->{oConfigHash}}{&CONFIG_HELP_COMMAND}{$strCommand}{&CONFIG_HELP_DESCRIPTION}; if (!defined($oDescription)) { confess &log(ERROR, "unable to find ${strCommand} command in sections - try adding command?"); } $strLatex .= "\n\\begin{sloppypar}" . $self->processText($oDescription) . "\\end{sloppypar}\n"; } # Add a list elsif ($oChild->nameGet() eq 'list') { $strLatex .= "\n\\begin{itemize}"; foreach my $oListItem ($oChild->nodeList()) { $strLatex .= "\n \\item " . $self->processText($oListItem->textGet()); } $strLatex .= "\n\\end{itemize}"; } # Add/remove config options elsif ($oChild->nameGet() eq 'backrest-config' || $oChild->nameGet() eq 'postgres-config') { $strLatex .= $self->configProcess($oSection, $oChild, $iDepth + 3); } # Add a subsection elsif ($oChild->nameGet() eq 'section') { $strLatex .= "\n" . $self->sectionProcess($oChild, $strSection, $iDepth + 1); } # Add an admonition (e.g. NOTE, WARNING, etc) elsif ($oChild->nameGet() eq 'admonition') { $strLatex .= "\n\\vspace{.5em}\\begin{leftbar}"; $strLatex .= "\n\\begin{sloppypar}\\textit{\\textbf{" . uc($oChild->paramGet('type')) . ": }"; $strLatex .= $self->processText($oChild->textGet()) . 
"}\\end{sloppypar}"; $strLatex .= "\n\\end{leftbar}\n"; } # Check if the child can be processed by a parent else { $self->sectionChildProcess($oSection, $oChild, $iDepth + 1); } } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strSection', value => $strLatex, trace => true} ); } #################################################################################################################################### # configProcess #################################################################################################################################### sub configProcess { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oSection, $oConfig, $iDepth ) = logDebugParam ( __PACKAGE__ . '->configProcess', \@_, {name => 'oSection'}, {name => 'oConfig'}, {name => 'iDepth'} ); # Working variables my $strLatex = ''; my $strFile; my $strConfig; my $bShow = true; # Generate the config if ($oConfig->nameGet() eq 'backrest-config') { ($strFile, $strConfig, $bShow) = $self->backrestConfig($oSection, $oConfig, $iDepth); } else { ($strFile, $strConfig, $bShow) = $self->postgresConfig($oSection, $oConfig, $iDepth); } if ($bShow) { my $strHostName = $self->{oManifest}->variableReplace($oConfig->paramGet('host')); # Replace _ in filename $strFile = $self->variableReplace($strFile); # Render the config $strLatex = "\n\\begin\{lstlisting\}[title=\{\\textnormal{\\textbf\{${strHostName}}}:\\textnormal{\\texttt\{${strFile}}} --- " . $self->processText($oConfig->nodeGet('title')->textGet()) . "}]\n" . (defined($strConfig) ? $strConfig : '') . 
"\\end{lstlisting}\n"; } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strConfig', value => $strLatex, trace => true} ); } 1; pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Markdown/000077500000000000000000000000001416457663300227775ustar00rootroot00000000000000pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Markdown/DocMarkdown.pm000066400000000000000000000060201416457663300255430ustar00rootroot00000000000000#################################################################################################################################### # DOC MARKDOWN MODULE #################################################################################################################################### package pgBackRestDoc::Markdown::DocMarkdown; use strict; use warnings FATAL => qw(all); use Carp qw(confess); use Data::Dumper; use Exporter qw(import); our @EXPORT = qw(); use File::Basename qw(dirname); use File::Copy; use POSIX qw(strftime); use Storable qw(dclone); use pgBackRestTest::Common::ExecuteTest; use pgBackRestDoc::Common::DocConfig; use pgBackRestDoc::Common::DocManifest; use pgBackRestDoc::Common::Log; use pgBackRestDoc::Common::String; use pgBackRestDoc::Markdown::DocMarkdownRender; use pgBackRestDoc::ProjectInfo; #################################################################################################################################### # CONSTRUCTOR #################################################################################################################################### sub new { my $class = shift; # Class name # Create the class hash my $self = {}; bless $self, $class; $self->{strClass} = $class; # Assign function parameters, defaults, and log debug info ( my $strOperation, $self->{oManifest}, $self->{strXmlPath}, $self->{strMarkdownPath}, $self->{bExe} ) = logDebugParam ( __PACKAGE__ . 
'->new', \@_,
            {name => 'oManifest'},
            {name => 'strXmlPath'},
            {name => 'strMarkdownPath'},
            {name => 'bExe'}
        );

    # Return from function and log return values if any
    return logDebugReturn
    (
        $strOperation,
        {name => 'self', value => $self}
    );
}

####################################################################################################################################
# process
#
# Generate the markdown pages
####################################################################################################################################
sub process
{
    my $self = shift;

    # Assign function parameters, defaults, and log debug info
    my $strOperation = logDebugParam(__PACKAGE__ . '->process');

    foreach my $strRenderOutId ($self->{oManifest}->renderOutList(RENDER_TYPE_MARKDOWN))
    {
        my $oRenderOut = $self->{oManifest}->renderOutGet(RENDER_TYPE_MARKDOWN, $strRenderOutId);
        my $strFile = "$self->{strMarkdownPath}/" . (defined($$oRenderOut{file}) ? $$oRenderOut{file} : "${strRenderOutId}.md");

        &log(INFO, "    render out: ${strRenderOutId}");

        # Save the markdown page
        $self->{oManifest}->storage()->put(
            $strFile,
            $self->{oManifest}->variableReplace((new pgBackRestDoc::Markdown::DocMarkdownRender(
                $self->{oManifest}, $strRenderOutId, $self->{bExe}))->process()));
    }

    # Return from function and log return values if any
    logDebugReturn($strOperation);
}

1;

pgbackrest-release-2.37/doc/lib/pgBackRestDoc/Markdown/DocMarkdownRender.pm
####################################################################################################################################
# DOC MARKDOWN RENDER MODULE
####################################################################################################################################
package pgBackRestDoc::Markdown::DocMarkdownRender;
use parent 'pgBackRestDoc::Common::DocExecute';

use strict;
use warnings FATAL => qw(all);
use Carp qw(confess);

use Data::Dumper;
use Exporter
qw(import); our @EXPORT = qw(); use File::Basename qw(dirname); use File::Copy; use Storable qw(dclone); use pgBackRestDoc::Common::DocConfig; use pgBackRestDoc::Common::DocManifest; use pgBackRestDoc::Common::Log; use pgBackRestDoc::Common::String; #################################################################################################################################### # CONSTRUCTOR #################################################################################################################################### sub new { my $class = shift; # Class name # Assign function parameters, defaults, and log debug info my ( $strOperation, $oManifest, $strRenderOutKey, $bExe ) = logDebugParam ( __PACKAGE__ . '->new', \@_, {name => 'oManifest'}, {name => 'strRenderOutKey'}, {name => 'bExe'} ); # Create the class hash my $self = $class->SUPER::new(RENDER_TYPE_MARKDOWN, $oManifest, $strRenderOutKey, $bExe); bless $self, $class; # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'self', value => $self} ); } #################################################################################################################################### # process # # Generate the site html #################################################################################################################################### sub process { my $self = shift; # Assign function parameters, defaults, and log debug info my $strOperation = logDebugParam(__PACKAGE__ . '->process'); # Working variables my $oPage = $self->{oDoc}; # Initialize page my $strMarkdown = "# " . $oPage->paramGet('title'); if (defined($oPage->paramGet('subtitle', false))) { $strMarkdown .= '
    ' . $oPage->paramGet('subtitle') . ''; } # my $oHtmlBuilder = new pgBackRestDoc::Html::DocHtmlBuilder("{[project]} - Reliable PostgreSQL Backup", # $strTitle . (defined($strSubTitle) ? " - ${strSubTitle}" : ''), # $self->{bPretty}); # # # Generate header # my $oPageHeader = $oHtmlBuilder->bodyGet()->addNew(HTML_DIV, 'page-header'); # # $oPageHeader-> # addNew(HTML_DIV, 'page-header-title', # {strContent => $strTitle}); # # if (defined($strSubTitle)) # { # $oPageHeader-> # addNew(HTML_DIV, 'page-header-subtitle', # {strContent => $strSubTitle}); # } # # # Generate menu # my $oMenuBody = $oHtmlBuilder->bodyGet()->addNew(HTML_DIV, 'page-menu')->addNew(HTML_DIV, 'menu-body'); # # if ($self->{strRenderOutKey} ne 'index') # { # my $oRenderOut = $self->{oManifest}->renderOutGet(RENDER_TYPE_HTML, 'index'); # # $oMenuBody-> # addNew(HTML_DIV, 'menu')-> # addNew(HTML_A, 'menu-link', {strContent => $$oRenderOut{menu}, strRef => '{[project-url-root]}'}); # } # # foreach my $strRenderOutKey ($self->{oManifest}->renderOutList(RENDER_TYPE_HTML)) # { # if ($strRenderOutKey ne $self->{strRenderOutKey} && $strRenderOutKey ne 'index') # { # my $oRenderOut = $self->{oManifest}->renderOutGet(RENDER_TYPE_HTML, $strRenderOutKey); # # $oMenuBody-> # addNew(HTML_DIV, 'menu')-> # addNew(HTML_A, 'menu-link', {strContent => $$oRenderOut{menu}, strRef => "${strRenderOutKey}.html"}); # } # } # # # Generate table of contents # my $oPageTocBody; # # if (!defined($oPage->paramGet('toc', false)) || $oPage->paramGet('toc') eq 'y') # { # my $oPageToc = $oHtmlBuilder->bodyGet()->addNew(HTML_DIV, 'page-toc'); # # $oPageToc-> # addNew(HTML_DIV, 'page-toc-title', # {strContent => "Table of Contents"}); # # $oPageTocBody = $oPageToc-> # addNew(HTML_DIV, 'page-toc-body'); # } # # # Generate body # my $oPageBody = $oHtmlBuilder->bodyGet()->addNew(HTML_DIV, 'page-body'); # Render sections foreach my $oSection ($oPage->nodeList('section')) { $strMarkdown = trim($strMarkdown) . "\n\n" . 
$self->sectionProcess($oSection, 1); } $strMarkdown .= "\n"; # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strMarkdown', value => $strMarkdown, trace => true} ); } #################################################################################################################################### # sectionProcess #################################################################################################################################### sub sectionProcess { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oSection, $iDepth ) = logDebugParam ( __PACKAGE__ . '->sectionProcess', \@_, {name => 'oSection'}, {name => 'iDepth'} ); if ($oSection->paramGet('log')) { &log(INFO, (' ' x ($iDepth + 1)) . 'process section: ' . $oSection->paramGet('path')); } if ($iDepth > 3) { confess &log(ASSERT, "section depth of ${iDepth} exceeds maximum"); } my $strMarkdown = '#' . ('#' x $iDepth) . ' ' . $self->processText($oSection->nodeGet('title')->textGet()); my $strLastChild = undef; foreach my $oChild ($oSection->nodeList()) { &log(DEBUG, (' ' x ($iDepth + 2)) . 'process child ' . $oChild->nameGet()); # Execute a command if ($oChild->nameGet() eq 'execute-list') { my $bShow = $oChild->paramTest('show', 'n') ? false : true; my $bFirst = true; my $strHostName = $self->{oManifest}->variableReplace($oChild->paramGet('host')); my $bOutput = false; if ($bShow) { $strMarkdown .= "\n\n${strHostName} => " . $self->processText($oChild->nodeGet('title')->textGet()) . 
"\n```\n"; } foreach my $oExecute ($oChild->nodeList('execute')) { my $bExeShow = !$oExecute->paramTest('show', 'n'); my $bExeExpectedError = defined($oExecute->paramGet('err-expect', false)); if ($bOutput) { confess &log(ERROR, "only the last command can have output"); } my ($strCommand, $strOutput) = $self->execute( $oSection, $strHostName, $oExecute, {iIndent => $iDepth + 3, bShow => $bShow && $bExeShow}); if ($bShow && $bExeShow) { # Add continuation chars and proper spacing $strCommand =~ s/\n/\n /smg; $strMarkdown .= "${strCommand}\n"; my $strHighLight = $self->{oManifest}->variableReplace($oExecute->fieldGet('exe-highlight', false)); my $bHighLightFound = false; if (defined($strOutput)) { $strMarkdown .= "\n--- output ---\n\n"; if ($oExecute->fieldTest('exe-highlight-type', 'error')) { $bExeExpectedError = true; } foreach my $strLine (split("\n", $strOutput)) { my $bHighLight = defined($strHighLight) && $strLine =~ /$strHighLight/; if ($bHighLight) { $strMarkdown .= $bExeExpectedError ? "ERR" : "-->"; } else { $strMarkdown .= " "; } $strMarkdown .= " ${strLine}\n"; $bHighLightFound = $bHighLightFound ? true : $bHighLight ? true : false; } $bFirst = true; } if ($self->{bExe} && $self->isRequired($oSection) && defined($strHighLight) && !$bHighLightFound) { confess &log(ERROR, "unable to find a match for highlight: ${strHighLight}"); } } $bFirst = false; } $strMarkdown .= "```"; } # Add code block elsif ($oChild->nameGet() eq 'code-block') { if ($oChild->paramTest('title')) { if (defined($strLastChild) && $strLastChild ne 'code-block') { $strMarkdown .= "\n"; } $strMarkdown .= "\n_" . $oChild->paramGet('title') . "_:"; } $strMarkdown .= "\n```"; if ($oChild->paramTest('type')) { $strMarkdown .= $oChild->paramGet('type'); } $strMarkdown .= "\n" . trim($oChild->valueGet()) . 
"\n```"; } # Add descriptive text elsif ($oChild->nameGet() eq 'p') { if (defined($strLastChild) && $strLastChild ne 'code-block' && $strLastChild ne 'table') { $strMarkdown .= "\n"; } $strMarkdown .= "\n" . $self->processText($oChild->textGet()); } # Add option descriptive text elsif ($oChild->nameGet() eq 'option-description') { # my $strOption = $oChild->paramGet("key"); # my $oDescription = ${$self->{oReference}->{oConfigHash}}{&CONFIG_HELP_OPTION}{$strOption}{&CONFIG_HELP_DESCRIPTION}; # # if (!defined($oDescription)) # { # confess &log(ERROR, "unable to find ${strOption} option in sections - try adding command?"); # } # # $oSectionBodyElement-> # addNew(HTML_DIV, 'section-body-text', # {strContent => $self->processText($oDescription)}); } # Add/remove backrest config options elsif ($oChild->nameGet() eq 'backrest-config') { # my $oConfigElement = $self->backrestConfigProcess($oSection, $oChild, $iDepth + 3); # # if (defined($oConfigElement)) # { # $oSectionBodyElement->add($oConfigElement); # } } # Add/remove postgres config options elsif ($oChild->nameGet() eq 'postgres-config') { # my $oConfigElement = $self->postgresConfigProcess($oSection, $oChild, $iDepth + 3); # # if (defined($oConfigElement)) # { # $oSectionBodyElement->add($oConfigElement); # } } # Add a list elsif ($oChild->nameGet() eq 'list') { foreach my $oListItem ($oChild->nodeList()) { $strMarkdown .= "\n\n- " . $self->processText($oListItem->textGet()); } } # Add a subsection elsif ($oChild->nameGet() eq 'section') { $strMarkdown = trim($strMarkdown) . "\n\n" . $self->sectionProcess($oChild, $iDepth + 1); } elsif ($oChild->nameGet() eq 'table') { my $oTableTitle; if ($oChild->nodeTest('title')) { $oTableTitle = $oChild->nodeGet('title'); } my $oHeader; my @oyColumn; if ($oChild->nodeTest('table-header')) { $oHeader = $oChild->nodeGet('table-header'); @oyColumn = $oHeader->nodeList('table-column'); } if (defined($oTableTitle)) { # Print the label (e.g. 
Table 1:) in front of the title if one exists $strMarkdown .= "\n\n**" . ($oTableTitle->paramTest('label') ? ($oTableTitle->paramGet('label') . ': ' . $self->processText($oTableTitle->textGet())) : $self->processText($oTableTitle->textGet())) . "**\n\n"; } else { $strMarkdown .= "\n\n"; } my $strHeaderText = "| "; my $strHeaderIndicator = "| "; for (my $iColCellIdx = 0; $iColCellIdx < @oyColumn; $iColCellIdx++) { my $strAlign = $oyColumn[$iColCellIdx]->paramGet("align", false, 'left'); $strHeaderText .= $self->processText($oyColumn[$iColCellIdx]->textGet()) . (($iColCellIdx < @oyColumn - 1) ? " | " : " |\n"); $strHeaderIndicator .= ($strAlign eq 'left' || $strAlign eq 'center') ? ":---" : "---"; $strHeaderIndicator .= ($strAlign eq 'right' || $strAlign eq 'center') ? "---:" : ""; $strHeaderIndicator .= ($iColCellIdx < @oyColumn - 1) ? " | " : " |\n"; } # Markdown requires a table header so if not provided then create an empty header row and default the column alignment # left by using the number of columns in the 1st row if (!defined($oHeader)) { my @oyRow = $oChild->nodeGet('table-data')->nodeList('table-row'); foreach my $oRowCell ($oyRow[0]->nodeList('table-cell')) { $strHeaderText .= " | "; $strHeaderIndicator .= ":--- | "; } $strHeaderText .= "\n"; $strHeaderIndicator .= "\n"; } $strMarkdown .= (defined($strHeaderText) ? $strHeaderText : '') . $strHeaderIndicator; # Build the rows foreach my $oRow ($oChild->nodeGet('table-data')->nodeList('table-row')) { my @oRowCellList = $oRow->nodeList('table-cell'); $strMarkdown .= "| "; for (my $iRowCellIdx = 0; $iRowCellIdx < @oRowCellList; $iRowCellIdx++) { my $oRowCell = $oRowCellList[$iRowCellIdx]; $strMarkdown .= $self->processText($oRowCell->textGet()) . (($iRowCellIdx < @oRowCellList -1) ? " | " : " |\n"); } } } # Add an admonition (e.g. NOTE, WARNING, etc) elsif ($oChild->nameGet() eq 'admonition') { $strMarkdown .= "\n> **" . uc($oChild->paramGet('type')) . ":** " . 
$self->processText($oChild->textGet()); } # Check if the child can be processed by a parent else { $self->sectionChildProcess($oSection, $oChild, $iDepth + 1); } $strLastChild = $oChild->nameGet(); } # Return from function and log return values if any return logDebugReturn ( $strOperation, {name => 'strMarkdown', value => $strMarkdown, trace => true} ); } #################################################################################################################################### # backrestConfigProcess #################################################################################################################################### sub backrestConfigProcess { my $self = shift; # Assign function parameters, defaults, and log debug info my ( $strOperation, $oSection, $oConfig, $iDepth ) = logDebugParam ( __PACKAGE__ . '->backrestConfigProcess', \@_, {name => 'oSection'}, {name => 'oConfig'}, {name => 'iDepth'} ); # # Generate the config # my $oConfigElement; # my ($strFile, $strConfig, $bShow) = $self->backrestConfig($oSection, $oConfig, $iDepth); # # if ($bShow) # { # my $strHostName = $self->{oManifest}->variableReplace($oConfig->paramGet('host')); # # # Render the config # $oConfigElement = new pgBackRestDoc::Html::DocHtmlElement(HTML_DIV, "config"); # # $oConfigElement-> # addNew(HTML_DIV, "config-title", # {strContent => "${strHostName}:${strFile}" . # " " . 
$self->processText($oConfig->nodeGet('title')->textGet())}); # # my $oConfigBodyElement = $oConfigElement->addNew(HTML_DIV, "config-body"); # # # # $oConfigBodyElement-> # # addNew(HTML_DIV, "config-body-title", # # {strContent => "${strFile}:"}); # # $oConfigBodyElement-> # addNew(HTML_DIV, "config-body-output", # {strContent => $strConfig}); # } # # # Return from function and log return values if any # return logDebugReturn # ( # $strOperation, # {name => 'oConfigElement', value => $oConfigElement, trace => true} # ); } #################################################################################################################################### # postgresConfigProcess #################################################################################################################################### sub postgresConfigProcess { my $self = shift; # # Assign function parameters, defaults, and log debug info # my # ( # $strOperation, # $oSection, # $oConfig, # $iDepth # ) = # logDebugParam # ( # __PACKAGE__ . '->postgresConfigProcess', \@_, # {name => 'oSection'}, # {name => 'oConfig'}, # {name => 'iDepth'} # ); # # # Generate the config # my $oConfigElement; # my ($strFile, $strConfig, $bShow) = $self->postgresConfig($oSection, $oConfig, $iDepth); # # if ($bShow) # { # # Render the config # my $strHostName = $self->{oManifest}->variableReplace($oConfig->paramGet('host')); # $oConfigElement = new pgBackRestDoc::Html::DocHtmlElement(HTML_DIV, "config"); # # $oConfigElement-> # addNew(HTML_DIV, "config-title", # {strContent => "${strHostName}:${strFile}" . # " " . $self->processText($oConfig->nodeGet('title')->textGet())}); # # my $oConfigBodyElement = $oConfigElement->addNew(HTML_DIV, "config-body"); # # # $oConfigBodyElement-> # # addNew(HTML_DIV, "config-body-title", # # {strContent => "append to ${strFile}:"}); # # $oConfigBodyElement-> # addNew(HTML_DIV, "config-body-output", # {strContent => defined($strConfig) ? 
$strConfig : ''}); # # $oConfig->fieldSet('actual-config', $strConfig); # } # # # Return from function and log return values if any # return logDebugReturn # ( # $strOperation, # {name => 'oConfigElement', value => $oConfigElement, trace => true} # ); } 1; pgbackrest-release-2.37/doc/lib/pgBackRestDoc/ProjectInfo.pm000066400000000000000000000054701416457663300240030ustar00rootroot00000000000000#################################################################################################################################### # PROJECT INFO MODULE # # Contains project name, version and format. #################################################################################################################################### package pgBackRestDoc::ProjectInfo; use strict; use warnings FATAL => qw(all); use Cwd qw(abs_path); use Exporter qw(import); our @EXPORT = qw(); use File::Basename qw(dirname); # Project Name # # Defines the official project name, exe, and config file. #----------------------------------------------------------------------------------------------------------------------------------- push @EXPORT, qw(PROJECT_NAME); push @EXPORT, qw(PROJECT_EXE); push @EXPORT, qw(PROJECT_CONF); # Project Version Number # # Defines the current version of the BackRest executable. The version number is used to track features but does not affect what # repositories or manifests can be read - that's the job of the format number. #----------------------------------------------------------------------------------------------------------------------------------- push @EXPORT, qw(PROJECT_VERSION); # Repository Format Number # # Defines format for info and manifest files as well as on-disk structure. If this number changes then the repository will be # invalid unless migration functions are written. 
#----------------------------------------------------------------------------------------------------------------------------------- push @EXPORT, qw(REPOSITORY_FORMAT); #################################################################################################################################### # Load project info from src/version.h #################################################################################################################################### require pgBackRestTest::Common::Storage; require pgBackRestTest::Common::StoragePosix; my $strProjectInfo = ${new pgBackRestTest::Common::Storage( dirname(dirname(abs_path($0))), new pgBackRestTest::Common::StoragePosix())->get('src/version.h')}; foreach my $strLine (split("\n", $strProjectInfo)) { if ($strLine =~ /^#define PROJECT_NAME/) { eval("use constant PROJECT_NAME => " . (split(" ", $strLine))[-1]); } elsif ($strLine =~ /^#define PROJECT_BIN/) { eval("use constant PROJECT_EXE => " . (split(" ", $strLine))[-1]); eval("use constant PROJECT_CONF => " . (split(" ", $strLine))[-1] . " . \'.conf\'"); } elsif ($strLine =~ /^#define PROJECT_VERSION/) { eval("use constant PROJECT_VERSION => " . (split(" ", $strLine))[-1]); } elsif ($strLine =~ /^#define REPOSITORY_FORMAT/) { eval("use constant REPOSITORY_FORMAT => " . 
(split(" ", $strLine))[-1]); } } 1; pgbackrest-release-2.37/doc/manifest.xml000066400000000000000000000132751416457663300203340ustar00rootroot00000000000000 pgBackRest Reliable PostgreSQL Backup & Restore use pgBackRestDoc::ProjectInfo; PROJECT_VERSION use pgBackRestDoc::Custom::DocCustomRelease; (new pgBackRestDoc::Custom::DocCustomRelease( new pgBackRestDoc::Common::Doc("{[doc-path]}/xml/release.xml")))->currentStableVersion(); pgbackrest / PostgreSQL - logo.png favicon.png n n n n use Time::Local; use pgBackRestDoc::Custom::DocCustomRelease; my ($second, $minute , $hour, $mday, $month, $year) = localtime(); $year += 1900; if ('{[release-date-static]}' eq 'y') { my $strDate = (new pgBackRestDoc::Custom::DocCustomRelease( new pgBackRestDoc::Common::Doc("{[doc-path]}/xml/release.xml")))->releaseLast()->paramGet('date'); if ($strDate eq 'XXXX-XX-XX') { confess &log(ERROR, 'not possible to use static release dates on a dev build'); } else { ($year, $month, $mday) = split(/[\s.\-]+/, $strDate); $month -= 1; } } my @stryMonth = ('January', 'February', 'March', 'April', 'May', 'June', 'July', 'August', 'September', 'October', 'November', 'December'); $stryMonth[$month] . ' ' . $mday . ', ' . $year; 'Copyright &copy; 2015' . '-' . substr('{[release-date]}', length('{[release-date]}') - 4) . ', The PostgreSQL Global Development Group, <a href="{[github-url-license]}">MIT License</a>. Updated ' . '{[release-date]}'; {[doc-path]}/output/latex/logo {[project]} User Guide Open Source PostgreSQL Backup and Restore Utility Version {[version]} Crunchy Data Solutions, Inc. 
cds-logo.eps {[pdf-title1]}\\{[pdf-title3]} \ \\-\ \thepage\ - {[pdf-organization]}\\\today CrunchyBackRest-UserGuide-{[version]} pgbackrest-release-2.37/doc/release.pl #!/usr/bin/perl #################################################################################################################################### # release.pl - pgBackRest Release Manager #################################################################################################################################### #################################################################################################################################### # Perl includes #################################################################################################################################### use strict; use warnings FATAL => qw(all); use Carp qw(confess); use English '-no_match_vars'; $SIG{__DIE__} = sub { Carp::confess @_ }; use Cwd qw(abs_path); use File::Basename qw(dirname); use Getopt::Long qw(GetOptions); use Pod::Usage qw(pod2usage); use Storable; use lib dirname($0) . '/lib'; use lib dirname(dirname($0)) . '/build/lib'; use lib dirname(dirname($0)) . '/lib'; use lib dirname(dirname($0)) . 
'/test/lib'; use pgBackRestTest::Common::ExecuteTest; use pgBackRestTest::Common::Storage; use pgBackRestTest::Common::StoragePosix; use pgBackRestTest::Common::VmTest; use pgBackRestDoc::Common::Doc; use pgBackRestDoc::Common::DocConfig; use pgBackRestDoc::Common::DocManifest; use pgBackRestDoc::Common::DocRender; use pgBackRestDoc::Common::Exception; use pgBackRestDoc::Common::Log; use pgBackRestDoc::Common::String; use pgBackRestDoc::Custom::DocCustomRelease; use pgBackRestDoc::Html::DocHtmlSite; use pgBackRestDoc::Latex::DocLatex; use pgBackRestDoc::Markdown::DocMarkdown; use pgBackRestDoc::ProjectInfo; #################################################################################################################################### # Usage #################################################################################################################################### =head1 NAME release.pl - pgBackRest Release Manager =head1 SYNOPSIS release.pl [options] General Options: --help Display usage and exit --version Display pgBackRest version --quiet Sets log level to ERROR --log-level Log level for execution (e.g. 
ERROR, WARN, INFO, DEBUG) Release Options: --build Build the cache before release (should be included in the release commit) --deploy Deploy documentation to website (can be done as docs are updated) --no-gen Don't auto-generate --vm vm to build documentation for =cut #################################################################################################################################### # Load command line parameters and config (see usage above for details) #################################################################################################################################### my $bHelp = false; my $bVersion = false; my $bQuiet = false; my $strLogLevel = 'info'; my $bBuild = false; my $bDeploy = false; my $bNoGen = false; my $strVm = undef; GetOptions ('help' => \$bHelp, 'version' => \$bVersion, 'quiet' => \$bQuiet, 'log-level=s' => \$strLogLevel, 'build' => \$bBuild, 'deploy' => \$bDeploy, 'no-gen' => \$bNoGen, 'vm=s' => \$strVm) or pod2usage(2); #################################################################################################################################### # Run in eval block to catch errors #################################################################################################################################### eval { # Display version and exit if requested if ($bHelp || $bVersion) { print PROJECT_NAME . ' ' . PROJECT_VERSION . " Release Manager\n"; if ($bHelp) { print "\n"; pod2usage(); } exit 0; } # If neither build nor deploy is requested then error if (!$bBuild && !$bDeploy) { confess &log(ERROR, 'neither --build nor --deploy requested, nothing to do'); } # Set console log level if ($bQuiet) { $strLogLevel = 'error'; } logLevelSet(undef, uc($strLogLevel), OFF); # Set the paths my $strDocPath = dirname(abs_path($0)); my $strDocHtml = "${strDocPath}/output/html"; my $strDocExe = "${strDocPath}/doc.pl"; my $strTestExe = dirname($strDocPath) . 
"/test/test.pl"; my $oStorageDoc = new pgBackRestTest::Common::Storage( $strDocPath, new pgBackRestTest::Common::StoragePosix({bFileSync => false, bPathSync => false})); # Determine if this is a dev release my $bDev = PROJECT_VERSION =~ /dev$/; my $strVersion = $bDev ? 'dev' : PROJECT_VERSION; # Make sure version number matches the latest release &log(INFO, "check version info"); my $strReleaseFile = dirname(dirname(abs_path($0))) . '/doc/xml/release.xml'; my $oRelease = (new pgBackRestDoc::Custom::DocCustomRelease(new pgBackRestDoc::Common::Doc($strReleaseFile)))->releaseLast(); if ($oRelease->paramGet('version') ne PROJECT_VERSION) { confess 'unable to find version ' . PROJECT_VERSION . " as the most recent release in ${strReleaseFile}"; } if ($bBuild) { if (!$bNoGen) { # Update git history my $strGitCommand = 'git -C ' . $strDocPath . ' log --pretty=format:\'{^^^^commit^^^^:^^^^%H^^^^,^^^^date^^^^:^^^^%ci^^^^,^^^^subject^^^^:^^^^%s^^^^,^^^^body^^^^:^^^^%b^^^^},\''; my $strGitLog = qx($strGitCommand); $strGitLog =~ s/\^\^\^\^\}\,\n/\#\#\#\#/mg; $strGitLog =~ s/\\/\\\\/g; $strGitLog =~ s/\n/\\n/mg; $strGitLog =~ s/\r/\\r/mg; $strGitLog =~ s/\t/\\t/mg; $strGitLog =~ s/\"/\\\"/g; $strGitLog =~ s/\^\^\^\^/\"/g; $strGitLog =~ s/\#\#\#\#/\"\}\,\n/mg; $strGitLog = '[' . substr($strGitLog, 0, length($strGitLog) - 1) . 
']'; my @hyGitLog = @{(JSON::PP->new()->allow_nonref())->decode($strGitLog)}; # Load prior history my @hyGitLogPrior = @{(JSON::PP->new()->allow_nonref())->decode( ${$oStorageDoc->get("${strDocPath}/resource/git-history.cache")})}; # Add new commits for (my $iGitLogIdx = @hyGitLog - 1; $iGitLogIdx >= 0; $iGitLogIdx--) { my $rhGitLog = $hyGitLog[$iGitLogIdx]; my $bFound = false; foreach my $rhGitLogPrior (@hyGitLogPrior) { if ($rhGitLog->{commit} eq $rhGitLogPrior->{commit}) { $bFound = true; } } next if $bFound; $rhGitLog->{body} = trim($rhGitLog->{body}); if ($rhGitLog->{body} eq '') { delete($rhGitLog->{body}); } unshift(@hyGitLogPrior, $rhGitLog); } # Write git log $strGitLog = undef; foreach my $rhGitLog (@hyGitLogPrior) { $strGitLog .= (defined($strGitLog) ? ",\n" : '') . " {\n" . ' "commit": ' . trim((JSON::PP->new()->allow_nonref()->pretty())->encode($rhGitLog->{commit})) . ",\n" . ' "date": ' . trim((JSON::PP->new()->allow_nonref()->pretty())->encode($rhGitLog->{date})) . ",\n" . ' "subject": ' . trim((JSON::PP->new()->allow_nonref()->pretty())->encode($rhGitLog->{subject})); # Skip the body if it is empty or a release (since we already have the release note content) if ($rhGitLog->{subject} !~ /^v[0-9]{1,2}\.[0-9]{1,2}\: /g && defined($rhGitLog->{body})) { $strGitLog .= ",\n" . ' "body": ' . trim((JSON::PP->new()->allow_nonref()->pretty())->encode($rhGitLog->{body})); } $strGitLog .= "\n" . 
" }"; } $oStorageDoc->put("${strDocPath}/resource/git-history.cache", "[\n${strGitLog}\n]\n"); # Generate coverage summary &log(INFO, "Generate Coverage Summary"); executeTest("${strTestExe} --vm=f33 --no-valgrind --clean --coverage-summary", {bShowOutputAsync => true}); } # Remove permanent cache file $oStorageDoc->remove("${strDocPath}/resource/exe.cache", {bIgnoreMissing => true}); # Remove all docker containers to get consistent IP address assignments executeTest('docker rm -f $(docker ps -a -q)', {bSuppressError => true}); # Generate deployment docs for RHEL if (!defined($strVm) || $strVm eq VM_RH8) { &log(INFO, "Generate RHEL documentation"); executeTest("${strDocExe} --deploy --key-var=os-type=rhel --out=pdf", {bShowOutputAsync => true}); if (!defined($strVm)) { executeTest("${strDocExe} --deploy --cache-only --key-var=os-type=rhel --out=pdf"); } } # Generate deployment docs for Debian if (!defined($strVm) || $strVm eq VM_U18) { &log(INFO, "Generate Debian/Ubuntu documentation"); executeTest("${strDocExe} --deploy --out=man --out=html --out=markdown", {bShowOutputAsync => true}); } # Generate a full copy of the docs for review if (!defined($strVm)) { &log(INFO, "Generate full documentation for review"); executeTest( "${strDocExe} --deploy --out-preserve --cache-only --key-var=os-type=rhel --out=html" . " --var=project-url-root=index.html"); $oStorageDoc->move("$strDocHtml/user-guide.html", "$strDocHtml/user-guide-rhel.html"); executeTest( "${strDocExe} --deploy --out-preserve --cache-only --out=man --out=html --var=project-url-root=index.html"); } } if ($bDeploy) { my $strDeployPath = "${strDocPath}/site"; # Generate docs for the website history &log(INFO, 'Generate website ' . ($bDev ? 'dev' : 'history') . ' documentation'); my $strDocExeVersion = ${strDocExe} . ($bDev ? ' --dev' : ' --deploy --cache-only') . 
' --var=project-url-root=index.html --out=html'; executeTest("${strDocExeVersion} --out-preserve --key-var=os-type=rhel"); $oStorageDoc->move("$strDocHtml/user-guide.html", "$strDocHtml/user-guide-rhel.html"); $oStorageDoc->remove("$strDocHtml/release.html"); executeTest("${strDocExeVersion} --out-preserve --exclude=release"); # Deploy to repository &log(INFO, '...Deploy to repository'); executeTest("rm -rf ${strDeployPath}/prior/${strVersion}"); executeTest("mkdir ${strDeployPath}/prior/${strVersion}"); executeTest("cp ${strDocHtml}/* ${strDeployPath}/prior/${strVersion}"); # Generate docs for the main website if (!$bDev) { &log(INFO, "Generate website documentation"); executeTest("${strDocExe} --var=analytics=y --deploy --cache-only --key-var=os-type=rhel --out=html"); $oStorageDoc->move("$strDocHtml/user-guide.html", "$strDocHtml/user-guide-rhel.html"); executeTest("${strDocExe} --var=analytics=y --deploy --out-preserve --cache-only --out=html"); # Deploy to repository &log(INFO, '...Deploy to repository'); executeTest("rm -rf ${strDeployPath}/dev"); executeTest("find ${strDeployPath} -maxdepth 1 -type f -exec rm {} +"); executeTest("cp ${strDocHtml}/* ${strDeployPath}"); executeTest("cp ${strDocPath}/../README.md ${strDeployPath}"); executeTest("cp ${strDocPath}/../LICENSE ${strDeployPath}"); } # Update permissions executeTest("find ${strDeployPath} -type d -exec chmod 750 {} +"); executeTest("find ${strDeployPath} -type f -exec chmod 640 {} +"); } # Exit with success exit 0; } #################################################################################################################################### # Check for errors #################################################################################################################################### or do { # If a backrest exception then return the code exit $EVAL_ERROR->code() if (isException(\$EVAL_ERROR)); # Else output the unhandled error print $EVAL_ERROR; exit ERROR_UNHANDLED; }; # It shouldn't be 
possible to get here &log(ASSERT, 'execution reached invalid location in ' . __FILE__ . ', line ' . __LINE__); exit ERROR_ASSERT; pgbackrest-release-2.37/doc/resource/000077500000000000000000000000001416457663300176235ustar00rootroot00000000000000pgbackrest-release-2.37/doc/resource/exe.cache000066400000000000000000016375301416457663300214100ustar00rootroot00000000000000{ "default" : { "all" : { "contributing" : [ { "key" : { "id" : "contrib", "image" : "pgbackrest/doc:contrib", "name" : "pgbackrest-dev", "option" : "-v /var/run/docker.sock:/var/run/docker.sock -v /home/vagrant/test:/home/vagrant/test", "os" : "u20", "update-hosts" : true }, "type" : "host", "value" : { "ip" : "172.17.0.8" } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo apt-get update" ], "host" : "pgbackrest-dev", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo apt-get install rsync git devscripts build-essential valgrind lcov autoconf \\", " autoconf-archive libssl-dev zlib1g-dev libxml2-dev libpq-dev pkg-config \\", " libxml-checker-perl libyaml-perl libdbd-pg-perl liblz4-dev liblz4-tool \\", " zstd libzstd-dev bzip2 libbz2-dev libyaml-dev" ], "cmd-extra" : "-y 2>&1", "host" : "pgbackrest-dev", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "pgbackrest/test/test.pl --clean-only" ], "host" : "pgbackrest-dev", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "curl -fsSL https://get.docker.com | sudo sh" ], "cmd-extra" : "2>&1", "host" : "pgbackrest-dev", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo usermod -aG docker `whoami`" ], "host" : "pgbackrest-dev", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chmod 666 
/var/run/docker.sock" ], "host" : "pgbackrest-dev", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "pgbackrest/test/test.pl --vm=none --dry-run" ], "cmd-extra" : "--no-log-timestamp", "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "[0-9]+ tests selected|DRY RUN COMPLETED SUCCESSFULLY" ] }, "host" : "pgbackrest-dev", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: test begin on x86_64 - log level info", "P00 INFO: configure build", "P00 INFO: builds required: bin", "P00 INFO: 74 tests selected", " ", "P00 INFO: P1-T01/74 - vm=none, module=common, test=error", " [filtered 71 lines of output]", "P00 INFO: P1-T73/74 - vm=none, module=performance, test=type", "P00 INFO: P1-T74/74 - vm=none, module=performance, test=storage", "P00 INFO: DRY RUN COMPLETED SUCCESSFULLY" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "pgbackrest/test/test.pl --vm=none --vm-out --module=common --test=wait" ], "cmd-extra" : "--no-log-timestamp", "host" : "pgbackrest-dev", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: test begin on x86_64 - log level info", "P00 INFO: autogenerate configure", "P00 INFO: autogenerated version in configure.ac script: no changes", "P00 INFO: autogenerated configure script: no changes", "P00 INFO: autogenerate code", "P00 INFO: cleanup old data", "P00 INFO: builds required: none", "P00 INFO: 1 test selected", " ", "P00 INFO: P1-T1/1 - vm=none, module=common, test=wait", " ", " run 1 - waitNew(), waitMore, and waitFree()", " L0018 expect AssertError: assertion 'waitTime <= 999999000' failed", " ", " run 1/1 ------------- L0021 0ms wait", " L0025 new wait", " L0026 check remaining time", " L0027 check wait time", " L0028 check sleep time", " L0029 check sleep prev time", " L0030 no wait more", " L0033 new wait = 0.2 sec", " L0034 check remaining 
time", " L0035 check wait time", " L0036 check sleep time", " L0037 check sleep prev time", " L0038 check begin time", " L0044 lower range check", " L0045 upper range check", " L0047 free wait", " L0052 new wait = 1.1 sec", " L0053 check wait time", " L0054 check sleep time", " L0055 check sleep prev time", " L0056 check begin time", " L0062 lower range check", " L0063 upper range check", " L0065 free wait", " ", " TESTS COMPLETED SUCCESSFULLY", "", "P00 INFO: P1-T1/1 - vm=none, module=common, test=wait", "P00 INFO: tested modules have full coverage", "P00 INFO: writing C coverage report", "P00 INFO: TESTS COMPLETED SUCCESSFULLY" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "pgbackrest/test/test.pl --vm=none --module=postgres" ], "cmd-extra" : "--no-log-timestamp", "host" : "pgbackrest-dev", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: test begin on x86_64 - log level info", "P00 INFO: autogenerate configure", "P00 INFO: autogenerated version in configure.ac script: no changes", "P00 INFO: autogenerated configure script: no changes", "P00 INFO: autogenerate code", "P00 INFO: cleanup old data", "P00 INFO: builds required: none", "P00 INFO: 2 tests selected", " ", "P00 INFO: P1-T1/2 - vm=none, module=postgres, test=client", "P00 INFO: P1-T2/2 - vm=none, module=postgres, test=interface", "P00 INFO: tested modules have full coverage", "P00 INFO: writing C coverage report", "P00 INFO: TESTS COMPLETED SUCCESSFULLY" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "pgbackrest/test/test.pl --vm-build --vm=u20" ], "cmd-extra" : "--no-log-timestamp", "host" : "pgbackrest-dev", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: test begin on x86_64 - log level info", "P00 INFO: Using cached pgbackrest/test:u20-base-20210930A image (7ffb73ceb9a2e3aad2cba7eb5c8e28fc3982db18) ...", "P00 INFO: Building pgbackrest/test:u20-test image ...", "P00 INFO: 
Build Complete" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "pgbackrest/test/test.pl --vm=u20 --module=mock --test=archive --run=2" ], "cmd-extra" : "--no-log-timestamp", "host" : "pgbackrest-dev", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: test begin on x86_64 - log level info", "P00 INFO: autogenerate configure", "P00 INFO: autogenerated version in configure.ac script: no changes", "P00 INFO: autogenerated configure script: no changes", "P00 INFO: autogenerate code", "P00 INFO: cleanup old data and containers", "P00 INFO: builds required: bin, bin host", "P00 INFO: build bin for u20 (/home/vagrant/test/bin/u20)", "P00 INFO: bin dependencies have changed, rebuilding", "P00 INFO: build bin for none (/home/vagrant/test/bin/none)", "P00 INFO: bin dependencies have changed, rebuilding", "P00 INFO: 1 test selected", " ", "P00 INFO: P1-T1/1 - vm=u20, module=mock, test=archive, run=2", "P00 INFO: no code modules had all tests run required for coverage", "P00 INFO: TESTS COMPLETED SUCCESSFULLY" ] } } ], "user-guide" : [ { "key" : { "id" : "azure", "image" : "mcr.microsoft.com/azure-storage/azurite", "name" : "azure-server", "option" : "-m 128m -v {[host-repo-path]}/doc/resource/fake-cert/azure-server.crt:/root/public.crt:ro -v {[host-repo-path]}/doc/resource/fake-cert/azure-server.key:/root/private.key:ro -e AZURITE_ACCOUNTS='pgbackrest:YXpLZXk='", "os" : "debian", "param" : "azurite-blob --blobPort 443 --blobHost 0.0.0.0 --cert=/root/public.crt --key=/root/private.key", "update-hosts" : false }, "type" : "host", "value" : { "ip" : "172.17.0.2" } }, { "key" : { "id" : "s3", "image" : "minio/minio", "name" : "s3-server", "option" : "-m 128m -v {[host-repo-path]}/doc/resource/fake-cert/s3-server.crt:/root/.minio/certs/public.crt:ro -v {[host-repo-path]}/doc/resource/fake-cert/s3-server.key:/root/.minio/certs/private.key:ro -e MINIO_REGION=us-east-1 -e MINIO_DOMAIN=s3.us-east-1.amazonaws.com -e 
MINIO_BROWSER=off -e MINIO_ACCESS_KEY=accessKey1 -e MINIO_SECRET_KEY=verySecretKey1", "os" : "debian", "param" : "server /data --address :443", "update-hosts" : false }, "type" : "host", "value" : { "ip" : "172.17.0.3" } }, { "key" : { "id" : "build", "image" : "pgbackrest/doc:debian", "name" : "build", "option" : "-m 256m -v /sys/fs/cgroup:/sys/fs/cgroup:rw -v /tmp/$(mktemp -d):/run", "os" : "debian", "update-hosts" : true }, "type" : "host", "value" : { "ip" : "172.17.0.4" } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p /build/pgbackrest-release-2.37" ], "host" : "build", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo cp -r /pgbackrest/src /build/pgbackrest-release-2.37" ], "host" : "build", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown -R vagrant /build/pgbackrest-release-2.37" ], "host" : "build", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo apt-get update" ], "host" : "build", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo apt-get install make gcc libpq-dev libssl-dev libxml2-dev pkg-config \\", " liblz4-dev libzstd-dev libbz2-dev libz-dev libyaml-dev" ], "cmd-extra" : "-y 2>&1", "host" : "build", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "cd /build/pgbackrest-release-2.37/src && ./configure && make" ], "cmd-extra" : "-j 4", "host" : "build", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "id" : "pg1", "image" : "pgbackrest/doc:debian", "name" : "pg-primary", "option" : "-m 512m -v /sys/fs/cgroup:/sys/fs/cgroup:rw -v /tmp/$(mktemp -d):/run", "os" : "debian", "update-hosts" : true }, "type" : "host", 
"value" : { "ip" : "172.17.0.5" } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo apt-get install postgresql-client libxml2" ], "cmd-extra" : "-y 2>&1", "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo scp build:/build/pgbackrest-release-2.37/src/pgbackrest /usr/bin" ], "cmd-extra" : "2>&1", "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chmod 755 /usr/bin/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p -m 770 /var/log/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown postgres:postgres /var/log/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p /etc/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p /etc/pgbackrest/conf.d" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo touch /etc/pgbackrest/pgbackrest.conf" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chmod 640 /etc/pgbackrest/pgbackrest.conf" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown postgres:postgres /etc/pgbackrest/pgbackrest.conf" ], "host" : "pg-primary", "load-env" : true, "output" : false, 
"run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "pgBackRest 2.37 - General help", "", "Usage:", " pgbackrest [options] [command]", "", "Commands:", " archive-get Get a WAL segment from the archive.", " archive-push Push a WAL segment to the archive.", " backup Backup a database cluster.", " check Check the configuration.", " expire Expire backups that exceed retention.", " help Get help.", " info Retrieve information about backups.", " repo-get Get a file from a repository.", " repo-ls List files in a repository.", " restore Restore a database cluster.", " server pgBackRest server.", " server-ping Ping pgBackRest server.", " stanza-create Create the required stanza data.", " stanza-delete Delete a stanza.", " stanza-upgrade Upgrade a stanza.", " start Allow pgBackRest processes to run.", " stop Stop pgBackRest processes from running.", " version Get version.", "", "Use 'pgbackrest help [command]' for more information." 
] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres /usr/lib/postgresql/12/bin/initdb \\", " -D /var/lib/postgresql/12/demo -k -A peer" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_createcluster 12 demo" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "Configuring already existing cluster (configuration: /etc/postgresql/12/demo, data: /var/lib/postgresql/12/demo, owner: 102:103)", "Ver Cluster Port Status Owner Data directory Log file", "12 demo 5432 down postgres /var/lib/postgresql/12/demo /var/log/postgresql/postgresql-12-demo.log" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "echo 'shared_buffers = 16MB' >> /etc/postgresql/12/demo/postgresql.conf" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : "root" }, "type" : "exe" }, { "key" : { "file" : "/etc/postgresql/12/demo/postgresql.conf", "host" : "pg-primary", "option" : { "listen_addresses" : { "value" : "'*'" } } }, "type" : "cfg-postgresql", "value" : { "config" : [ "listen_addresses = '*'" ] } }, { "key" : { "file" : "/etc/postgresql/12/demo/postgresql.conf", "host" : "pg-primary", "option" : { "log_line_prefix" : { "value" : "''" } } }, "type" : "cfg-postgresql", "value" : { "config" : [ "listen_addresses = '*'", "log_line_prefix = ''" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "demo" : { "pg1-path" : { "value" : "/var/lib/postgresql/12/demo" } }, "global" : { "log-level-stderr" : { "value" : "off" }, "log-timestamp" : { "value" : "n" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/12/demo" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres bash -c ' \\", " export PGBACKREST_LOG_PATH=/path/set/by/env && \\", " pgbackrest --log-level-console=error help 
backup log-path'" ], "highlight" : { "filter" : false, "filter-context" : 2, "list" : [ "current\\: \\/path\\/set\\/by\\/env" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "pgBackRest 2.37 - 'backup' command - 'log-path' option help", "", "Path where log files are stored.", "", "The log path provides a location for pgBackRest to store log files. Note that", "if log-level-file=off then no log path is required.", "", "current: /path/set/by/env", "default: /var/log/pgbackrest" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p /var/lib/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chmod 750 /var/lib/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown postgres:postgres /var/lib/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "repo1-path" : { "value" : "/var/lib/pgbackrest" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/12/demo", "", "[global]", "repo1-path=/var/lib/pgbackrest" ] } }, { "key" : { "file" : "/etc/postgresql/12/demo/postgresql.conf", "host" : "pg-primary", "option" : { "archive_command" : { "value" : "'pgbackrest --stanza=demo archive-push %p'" }, "archive_mode" : { "value" : "on" }, "max_wal_senders" : { "value" : "3" }, "wal_level" : { "value" : "replica" } } }, "type" : "cfg-postgresql", "value" : { "config" : [ "archive_command = 'pgbackrest --stanza=demo archive-push %p'", "archive_mode = on", "listen_addresses = '*'", "log_line_prefix = ''", "max_wal_senders = 3", "wal_level = replica" ] } }, { 
"key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo restart" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global:archive-push" : { "compress-level" : { "value" : "3" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/12/demo", "", "[global]", "repo1-path=/var/lib/pgbackrest", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "repo1-retention-full" : { "value" : "2" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/12/demo", "", "[global]", "repo1-path=/var/lib/pgbackrest", "repo1-retention-full=2", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "repo1-cipher-pass" : { "value" : "zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO" }, "repo1-cipher-type" : { "value" : "aes-256-cbc" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/12/demo", "", "[global]", "repo1-cipher-pass=zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO", "repo1-cipher-type=aes-256-cbc", "repo1-path=/var/lib/pgbackrest", "repo1-retention-full=2", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --log-level-console=info stanza-create" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "completed successfully" ] }, "host" : "pg-primary", "load-env" : true, 
"output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: stanza-create command begin 2.37: --exec-id=1293-896bc18e --log-level-console=info --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/postgresql/12/demo --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --stanza=demo", "P00 INFO: stanza-create for stanza 'demo' on repo1", "P00 INFO: stanza-create command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --log-level-console=info check" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ " successfully archived to " ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: check command begin 2.37: --exec-id=1302-851f6789 --log-level-console=info --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/postgresql/12/demo --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --stanza=demo", "P00 INFO: check repo1 configuration (primary)", "P00 INFO: check repo1 archive for WAL (primary)", "P00 INFO: WAL segment 000000010000000000000001 successfully archived to '/var/lib/pgbackrest/archive/demo/12-1/0000000100000000/000000010000000000000001-573dbc22811c4f0055caa67a7cf20c9b1d4a5a35.gz' on repo1", "P00 INFO: check command end: completed successfully" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "start-fast" : { "value" : "y" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/12/demo", "", "[global]", "repo1-cipher-pass=zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO", "repo1-cipher-type=aes-256-cbc", "repo1-path=/var/lib/pgbackrest", "repo1-retention-full=2", "start-fast=y", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { 
"bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo \\", " --log-level-console=info backup" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "no prior backup exists|full backup size" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: backup command begin 2.37: --exec-id=1330-32ed25fb --log-level-console=info --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/postgresql/12/demo --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo1-retention-full=2 --stanza=demo --start-fast", "P00 WARN: no prior backup exists, incr backup has been changed to full", "P00 INFO: execute non-exclusive pg_start_backup(): backup begins after the requested immediate checkpoint completes", "P00 INFO: backup start archive = 000000010000000000000002, lsn = 0/2000028", " [filtered 3 lines of output]", "P00 INFO: check archive for segment(s) 000000010000000000000002:000000010000000000000003", "P00 INFO: new backup label = 20211231-195532F", "P00 INFO: full backup size = 23.4MB, file total = 976", "P00 INFO: backup command end: completed successfully", "P00 INFO: expire command begin 2.37: --exec-id=1330-32ed25fb --log-level-console=info --log-level-stderr=off --no-log-timestamp --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo1-retention-full=2 --stanza=demo" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest repo-ls backup/demo --filter=\"(F|D|I)$\" --sort=desc | head -1" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "20211231-195532F" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=diff \\", " --log-level-console=info backup" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "diff backup size" ] }, 
"host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 7 lines of output]", "P00 INFO: check archive for segment(s) 000000010000000000000004:000000010000000000000005", "P00 INFO: new backup label = 20211231-195532F_20211231-195538D", "P00 INFO: diff backup size = 8.3KB, file total = 976", "P00 INFO: backup command end: completed successfully", "P00 INFO: expire command begin 2.37: --exec-id=1357-bbff90c0 --log-level-console=info --log-level-stderr=off --no-log-timestamp --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo1-retention-full=2 --stanza=demo" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest info" ], "highlight" : { "filter" : false, "filter-context" : 2, "list" : [ "(full|incr|diff) backup" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "stanza: demo", " status: ok", " cipher: aes-256-cbc", "", " db (current)", " wal archive min/max (12): 000000010000000000000001/000000010000000000000005", "", " full backup: 20211231-195532F", " timestamp start/stop: 2021-12-31 19:55:32 / 2021-12-31 19:55:36", " wal start/stop: 000000010000000000000002 / 000000010000000000000003", " database size: 23.4MB, database backup size: 23.4MB", " repo1: backup set size: 2.8MB, backup size: 2.8MB", "", " diff backup: 20211231-195532F_20211231-195538D", " timestamp start/stop: 2021-12-31 19:55:38 / 2021-12-31 19:55:40", " wal start/stop: 000000010000000000000004 / 000000010000000000000005", " database size: 23.4MB, database backup size: 8.3KB", " repo1: backup set size: 2.8MB, backup size: 496B", " backup reference list: 20211231-195532F" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo stop" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : 
true, "cmd" : [ "sudo -u postgres rm /var/lib/postgresql/12/demo/global/pg_control" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo start" ], "err-expect" : "1", "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "could not find the database system" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "Error: /usr/lib/postgresql/12/bin/pg_ctl /usr/lib/postgresql/12/bin/pg_ctl start -D /var/lib/postgresql/12/demo -l /var/log/postgresql/postgresql-12-demo.log -s -o -c config_file=\"/etc/postgresql/12/demo/postgresql.conf\" exited with status 1: ", "postgres: could not find the database system", "Expected to find it in the directory \"/var/lib/postgresql/12/demo\",", "but could not open file \"/var/lib/postgresql/12/demo/global/pg_control\": No such file or directory", "Examine the log output." 
] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres find /var/lib/postgresql/12/demo -mindepth 1 -delete" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo restore" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo start" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres mkdir -p /var/lib/postgresql/pgbackrest/doc/example" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cp -r /pgbackrest/doc/example/* \\", " /var/lib/postgresql/pgbackrest/doc/example" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat \\", " /var/lib/postgresql/pgbackrest/doc/example/pgsql-pgbackrest-info.sql" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "-- An example of monitoring pgBackRest from within PostgreSQL", "--", "-- Use copy to export data from the pgBackRest info command into the jsonb", "-- type so it can be queried directly by PostgreSQL.", "", "-- Create monitor schema", "create schema monitor;", "", "-- Get pgBackRest info in JSON format", "create function monitor.pgbackrest_info()", " returns jsonb AS $$", "declare", " data jsonb;", "begin", " -- Create a temp table to hold the JSON data", " create temp table 
temp_pgbackrest_data (data jsonb);", "", " -- Copy data into the table directly from the pgBackRest info command", " copy temp_pgbackrest_data (data)", " from program", " 'pgbackrest --output=json info' (format text);", "", " select temp_pgbackrest_data.data", " into data", " from temp_pgbackrest_data;", "", " drop table temp_pgbackrest_data;", "", " return data;", "end $$ language plpgsql;" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -f \\", " /var/lib/postgresql/pgbackrest/doc/example/pgsql-pgbackrest-info.sql" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat \\", " /var/lib/postgresql/pgbackrest/doc/example/pgsql-pgbackrest-query.sql" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "-- Get last successful backup for each stanza", "--", "-- Requires the monitor.pgbackrest_info function.", "with stanza as", "(", " select data->'name' as name,", " data->'backup'->(", " jsonb_array_length(data->'backup') - 1) as last_backup,", " data->'archive'->(", " jsonb_array_length(data->'archive') - 1) as current_archive", " from jsonb_array_elements(monitor.pgbackrest_info()) as data", ")", "select name,", " to_timestamp(", " (last_backup->'timestamp'->>'stop')::numeric) as last_successful_backup,", " current_archive->>'max' as last_archived_wal", " from stanza;" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -f \\", " /var/lib/postgresql/pgbackrest/doc/example/pgsql-pgbackrest-query.sql" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " name | last_successful_backup | last_archived_wal ", "--------+------------------------+--------------------------", " \"demo\" | 2021-12-31 19:55:40+00 | 000000010000000000000005", "(1 row)" ] } }, { "key" : { 
"bash-wrap" : true, "cmd" : [ "sudo apt-get install jq" ], "cmd-extra" : "-y 2>&1", "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --output=json --stanza=demo info | \\", " jq '.[0] | .backup[-1] | .timestamp.stop'" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "1640980540" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --output=json --stanza=demo info | \\", " jq '.[0] | .archive[-1] | .max'" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "\"000000010000000000000005\"" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "repo1-retention-full" : { "value" : "2" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/12/demo", "", "[global]", "repo1-cipher-pass=zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO", "repo1-cipher-type=aes-256-cbc", "repo1-path=/var/lib/pgbackrest", "repo1-retention-full=2", "start-fast=y", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=full \\", " --log-level-console=detail backup" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "archive retention on backup 20211231-195532F|remove archive" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 985 lines of output]", "P00 INFO: backup command end: completed successfully", "P00 INFO: expire command begin 2.37: --exec-id=1625-0be930da --log-level-console=detail --log-level-stderr=off --no-log-timestamp --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc 
--repo1-path=/var/lib/pgbackrest --repo1-retention-full=2 --stanza=demo", "P00 DETAIL: repo1: 12-1 archive retention on backup 20211231-195532F, start = 000000010000000000000002", "P00 INFO: repo1: 12-1 remove archive, start = 000000010000000000000001, stop = 000000010000000000000001", "P00 INFO: expire command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest repo-ls backup/demo --filter=\"(F|D|I)$\" --sort=desc | head -1" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "20211231-195604F" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=full \\", " --log-level-console=info backup" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "expire full backup set 20211231-195532F|archive retention on backup 20211231-195604F|remove archive" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 9 lines of output]", "P00 INFO: backup command end: completed successfully", "P00 INFO: expire command begin 2.37: --exec-id=1652-b7ac48aa --log-level-console=info --log-level-stderr=off --no-log-timestamp --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo1-retention-full=2 --stanza=demo", "P00 INFO: repo1: expire full backup set 20211231-195532F, 20211231-195532F_20211231-195538D", "P00 INFO: repo1: remove expired backup 20211231-195532F_20211231-195538D", "P00 INFO: repo1: remove expired backup 20211231-195532F", "P00 INFO: repo1: 12-1 remove archive, start = 0000000100000000, stop = 000000020000000000000006", "P00 INFO: expire command end: completed successfully" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "repo1-retention-diff" : { "value" : "1" } } } }, "type" : "cfg-pgbackrest", 
"value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/12/demo", "", "[global]", "repo1-cipher-pass=zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO", "repo1-cipher-type=aes-256-cbc", "repo1-path=/var/lib/pgbackrest", "repo1-retention-diff=1", "repo1-retention-full=2", "start-fast=y", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=diff backup" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest repo-ls backup/demo --filter=\"(F|D|I)$\" --sort=desc | head -1" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "20211231-195609F_20211231-195615D" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=incr backup" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=diff \\", " --log-level-console=info backup" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "expire diff backup set 20211231-195609F_20211231-195615D" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 10 lines of output]", "P00 INFO: backup command end: completed successfully", "P00 INFO: expire command begin 2.37: --exec-id=1729-c585e461 --log-level-console=info --log-level-stderr=off --no-log-timestamp --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo1-retention-diff=1 --repo1-retention-full=2 --stanza=demo", "P00 INFO: repo1: expire diff backup set 20211231-195609F_20211231-195615D, 20211231-195609F_20211231-195619I", "P00 INFO: repo1: remove 
expired backup 20211231-195609F_20211231-195619I", "P00 INFO: repo1: remove expired backup 20211231-195609F_20211231-195615D", "P00 INFO: expire command end: completed successfully" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "repo1-retention-diff" : { "value" : "2" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/12/demo", "", "[global]", "repo1-cipher-pass=zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO", "repo1-cipher-type=aes-256-cbc", "repo1-path=/var/lib/pgbackrest", "repo1-retention-diff=2", "repo1-retention-full=2", "start-fast=y", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest repo-ls backup/demo --filter=\"(F|D|I)$\" --sort=desc | head -1" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "20211231-195609F_20211231-195621D" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \" \\", " select pg_create_restore_point('generate WAL'); select pg_switch_wal(); \\", " select pg_create_restore_point('generate WAL'); select pg_switch_wal();\"" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=diff \\", " --log-level-console=info backup" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "new backup label" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 6 lines of output]", "P00 INFO: backup stop archive = 000000020000000000000013, lsn = 0/13000050", "P00 INFO: check archive for segment(s) 000000020000000000000012:000000020000000000000013", "P00 INFO: new backup label = 
20211231-195609F_20211231-195627D", "P00 INFO: diff backup size = 8.3KB, file total = 976", "P00 INFO: backup command end: completed successfully", " [filtered 2 lines of output]" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest repo-ls backup/demo --filter=\"(F|D|I)$\" --sort=desc | head -1" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "20211231-195609F_20211231-195627D" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --log-level-console=detail \\", " --repo1-retention-archive-type=diff --repo1-retention-archive=1 expire" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "archive retention on backup 20211231-195609F_20211231-195621D|remove archive" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: expire command begin 2.37: --exec-id=1812-3bf5f31c --log-level-console=detail --log-level-stderr=off --no-log-timestamp --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo1-retention-archive=1 --repo1-retention-archive-type=diff --repo1-retention-diff=2 --repo1-retention-full=2 --stanza=demo", "P00 DETAIL: repo1: 12-1 archive retention on backup 20211231-195604F, start = 000000020000000000000007, stop = 000000020000000000000007", "P00 DETAIL: repo1: 12-1 archive retention on backup 20211231-195609F, start = 000000020000000000000008, stop = 000000020000000000000009", "P00 DETAIL: repo1: 12-1 archive retention on backup 20211231-195609F_20211231-195621D, start = 00000002000000000000000E, stop = 00000002000000000000000F", "P00 DETAIL: repo1: 12-1 archive retention on backup 20211231-195609F_20211231-195627D, start = 000000020000000000000012", "P00 INFO: repo1: 12-1 remove archive, start = 00000002000000000000000A, stop = 00000002000000000000000D", "P00 INFO: repo1: 12-1 
remove archive, start = 000000020000000000000010, stop = 000000020000000000000011", "P00 INFO: expire command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo stop" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --delta \\", " --log-level-console=detail restore" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "demo\\/PG_VERSION - exists and matches backup|remove invalid files|rename global\\/pg_control" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 2 lines of output]", "P00 DETAIL: check '/var/lib/postgresql/12/demo' exists", "P00 DETAIL: remove 'global/pg_control' so cluster will not start if restore does not complete", "P00 INFO: remove invalid files/links/paths from '/var/lib/postgresql/12/demo'", "P00 DETAIL: remove invalid file '/var/lib/postgresql/12/demo/backup_label.old'", "P00 DETAIL: remove invalid file '/var/lib/postgresql/12/demo/base/1/pg_internal.init'", " [filtered 816 lines of output]", "P01 DETAIL: restore file /var/lib/postgresql/12/demo/base/13397/PG_VERSION - exists and matches backup (3B, 99%) checksum ad552e6dc057d1d825bf49df79d6b98eba846ebe", "P01 DETAIL: restore file /var/lib/postgresql/12/demo/base/1/PG_VERSION - exists and matches backup (3B, 99%) checksum ad552e6dc057d1d825bf49df79d6b98eba846ebe", "P01 DETAIL: restore file /var/lib/postgresql/12/demo/PG_VERSION - exists and matches backup (3B, 100%) checksum ad552e6dc057d1d825bf49df79d6b98eba846ebe", "P01 DETAIL: restore file /var/lib/postgresql/12/demo/global/6100 - exists and is zero size (0B, 100%)", "P01 DETAIL: restore file /var/lib/postgresql/12/demo/global/6000 - exists and is zero size (0B, 100%)", " [filtered 202 lines of output]" ] } }, { "key" : { "bash-wrap" : true, 
"cmd" : [ "sudo pg_ctlcluster 12 demo start" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"create database test1;\"" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "CREATE DATABASE" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"create database test2;\"" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "CREATE DATABASE" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"create table test1_table (id int); \\", " insert into test1_table (id) values (1);\" test1" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "INSERT 0 1" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"create table test2_table (id int); \\", " insert into test2_table (id) values (2);\" test2" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "INSERT 0 1" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=incr backup" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -Atc \"select oid from pg_database where datname = 'test1'\"" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "24576" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres du -sh 
/var/lib/postgresql/12/demo/base/24576" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "7.8M\t/var/lib/postgresql/12/demo/base/24576" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest repo-ls backup/demo --filter=\"(F|D|I)$\" --sort=desc | head -1" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "20211231-195609F_20211231-195642I" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo \\", " --set=20211231-195609F_20211231-195642I info" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "database list" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 11 lines of output]", " repo1: backup set size: 4.7MB, backup size: 1.9MB", " backup reference list: 20211231-195609F, 20211231-195609F_20211231-195627D", " database list: postgres (13398), test1 (24576), test2 (24577)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo stop" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --delta \\", " --db-include=test2 --type=immediate --target-action=promote restore" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo start" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c 
\"select * from test2_table;\" test2" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " id ", "----", " 2", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"select * from test1_table;\" test1" ], "err-expect" : "2", "highlight" : { "filter" : false, "filter-context" : 2, "list" : [ "relation mapping file.*contains invalid data" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "psql: error: connection to server on socket \"/var/run/postgresql/.s.PGSQL.5432\" failed: FATAL: relation mapping file \"base/24576/pg_filenode.map\" contains invalid data" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres du -sh /var/lib/postgresql/12/demo/base/24576" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "16K\t/var/lib/postgresql/12/demo/base/24576" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"drop database test1;\"" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "DROP DATABASE" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"select oid, datname from pg_database order by oid;\"" ], "highlight" : { "filter" : false, "filter-context" : 2, "list" : [ "test2" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " oid | datname ", "-------+-----------", " 1 | template1", " 13397 | template0", " 13398 | postgres", " 24577 | test2", "(4 rows)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=diff backup" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, 
"cmd" : [ "sudo -u postgres psql -c \"begin; \\", " create table important_table (message text); \\", " insert into important_table values ('Important Data'); \\", " commit; \\", " select * from important_table;\"" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "Important Data" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " message ", "----------------", " Important Data", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 1" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -Atc \"select current_timestamp\"" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "2021-12-31 19:57:03.172377+00" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 1" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"begin; \\", " drop table important_table; \\", " commit; \\", " select * from important_table;\"" ], "err-expect" : "1", "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "does not exist" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "ERROR: relation \"important_table\" does not exist", "LINE 1: ...le important_table; commit; select * from important_...", " ^" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo stop" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --delta \\", " --type=time \"--target=2021-12-31 19:57:03.172377+00\" \\", " 
--target-action=promote restore" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo rm /var/log/postgresql/postgresql-12-demo.log" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/lib/postgresql/12/demo/postgresql.auto.conf" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "recovery_target_time" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 14 lines of output]", "# Recovery settings generated by pgBackRest restore on 2021-12-31 19:57:07", "restore_command = 'pgbackrest --stanza=demo archive-get %f \"%p\"'", "recovery_target_time = '2021-12-31 19:57:03.172377+00'", "recovery_target_action = 'promote'" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo start" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"select * from important_table\"" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "Important Data" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " message ", "----------------", " Important Data", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/log/postgresql/postgresql-12-demo.log" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "recovery stopping before|last completed transaction|starting point-in-time recovery" ] }, "host" : "pg-primary", 
"load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 3 lines of output]", "LOG: listening on Unix socket \"/var/run/postgresql/.s.PGSQL.5432\"", "LOG: database system was interrupted; last known up at 2021-12-31 19:56:58 UTC", "LOG: starting point-in-time recovery to 2021-12-31 19:57:03.172377+00", "LOG: restored log file \"00000004.history\" from archive", "LOG: restored log file \"000000040000000000000016\" from archive", " [filtered 2 lines of output]", "LOG: database system is ready to accept read only connections", "LOG: restored log file \"000000040000000000000017\" from archive", "LOG: recovery stopping before commit of transaction 495, time 2021-12-31 19:57:05.396152+00", "LOG: redo done at 0/17019E10", "LOG: last completed transaction was at log time 2021-12-31 19:57:00.903751+00", "LOG: selected new timeline ID: 5", "LOG: archive recovery complete", " [filtered 2 lines of output]" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"begin; \\", " drop table important_table; \\", " commit; \\", " select * from important_table;\"" ], "err-expect" : "1", "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "does not exist" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "ERROR: relation \"important_table\" does not exist", "LINE 1: ...le important_table; commit; select * from important_...", " ^" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=incr backup" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest repo-ls backup/demo --filter=\"(F|D|I)$\" --sort=desc | head -1" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ 
"20211231-195609F_20211231-195716I" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest info" ], "highlight" : { "filter" : false, "filter-context" : 2, "list" : [ "20211231-195609F_20211231-195716I" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "stanza: demo", " status: ok", " cipher: aes-256-cbc", "", " db (current)", " wal archive min/max (12): 000000020000000000000007/000000050000000000000018", "", " full backup: 20211231-195604F", " timestamp start/stop: 2021-12-31 19:56:04 / 2021-12-31 19:56:07", " wal start/stop: 000000020000000000000007 / 000000020000000000000007", " database size: 23.4MB, database backup size: 23.4MB", " repo1: backup set size: 2.8MB, backup size: 2.8MB", "", " full backup: 20211231-195609F", " timestamp start/stop: 2021-12-31 19:56:09 / 2021-12-31 19:56:13", " wal start/stop: 000000020000000000000008 / 000000020000000000000009", " database size: 23.4MB, database backup size: 23.4MB", " repo1: backup set size: 2.8MB, backup size: 2.8MB", "", " diff backup: 20211231-195609F_20211231-195627D", " timestamp start/stop: 2021-12-31 19:56:27 / 2021-12-31 19:56:29", " wal start/stop: 000000020000000000000012 / 000000020000000000000013", " database size: 23.4MB, database backup size: 8.3KB", " repo1: backup set size: 2.8MB, backup size: 512B", " backup reference list: 20211231-195609F", "", " incr backup: 20211231-195609F_20211231-195642I", " timestamp start/stop: 2021-12-31 19:56:42 / 2021-12-31 19:56:44", " wal start/stop: 000000030000000000000015 / 000000030000000000000015", " database size: 38.7MB, database backup size: 15.8MB", " repo1: backup set size: 4.7MB, backup size: 1.9MB", " backup reference list: 20211231-195609F, 20211231-195609F_20211231-195627D", "", " diff backup: 20211231-195609F_20211231-195658D", " timestamp start/stop: 2021-12-31 19:56:58 / 2021-12-31 19:57:00", " wal start/stop: 000000040000000000000016 / 
000000040000000000000016", " database size: 31MB, database backup size: 8.2MB", " repo1: backup set size: 3.8MB, backup size: 1011.1KB", " backup reference list: 20211231-195609F", "", " incr backup: 20211231-195609F_20211231-195716I", " timestamp start/stop: 2021-12-31 19:57:16 / 2021-12-31 19:57:18", " wal start/stop: 000000050000000000000018 / 000000050000000000000018", " database size: 31MB, database backup size: 2.2MB", " repo1: backup set size: 3.8MB, backup size: 234KB", " backup reference list: 20211231-195609F, 20211231-195609F_20211231-195658D" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo stop" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --delta \\", " --set=20211231-195609F_20211231-195716I \\", " --type=time \"--target=2021-12-31 19:57:03.172377+00\" --target-action=promote restore" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo rm /var/log/postgresql/postgresql-12-demo.log" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo start" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"select * from important_table\"" ], "err-expect" : "1", "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "does not exist" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "ERROR: relation 
\"important_table\" does not exist", "LINE 1: select * from important_table", " ^" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/log/postgresql/postgresql-12-demo.log" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "starting point-in-time recovery|consistent recovery state reached" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 3 lines of output]", "LOG: listening on Unix socket \"/var/run/postgresql/.s.PGSQL.5432\"", "LOG: database system was interrupted; last known up at 2021-12-31 19:57:17 UTC", "LOG: starting point-in-time recovery to 2021-12-31 19:57:03.172377+00", "LOG: restored log file \"00000005.history\" from archive", "LOG: restored log file \"000000050000000000000018\" from archive", "LOG: redo starts at 0/18000028", "LOG: consistent recovery state reached at 0/18000100", "LOG: database system is ready to accept read only connections", "LOG: redo done at 0/18000100", " [filtered 7 lines of output]" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo stop" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --delta \\", " --type=time \"--target=2021-12-31 19:57:03.172377+00\" \\", " --target-action=promote restore" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo rm /var/log/postgresql/postgresql-12-demo.log" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo start" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres 
sleep 2" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"select * from important_table\"" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "Important Data" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " message ", "----------------", " Important Data", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/log/postgresql/postgresql-12-demo.log" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "recovery stopping before|last completed transaction|starting point-in-time recovery" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 5 lines of output]", "LOG: restored log file \"00000005.history\" from archive", "LOG: restored log file \"00000006.history\" from archive", "LOG: starting point-in-time recovery to 2021-12-31 19:57:03.172377+00", "LOG: restored log file \"00000006.history\" from archive", "LOG: restored log file \"000000040000000000000016\" from archive", " [filtered 4 lines of output]", "LOG: database system is ready to accept read only connections", "LOG: restored log file \"000000050000000000000017\" from archive", "LOG: recovery stopping before commit of transaction 496, time 2021-12-31 19:57:15.721973+00", "LOG: redo done at 0/17022440", "LOG: last completed transaction was at log time 2021-12-31 19:57:00.903751+00", "LOG: selected new timeline ID: 7", "LOG: archive recovery complete", " [filtered 2 lines of output]" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "process-max" : { "value" : "4" }, "repo2-azure-account" : { "value" : "pgbackrest" }, "repo2-azure-container" : { "value" : "demo-container" }, "repo2-azure-key" : 
{ "value" : "YXpLZXk=" }, "repo2-path" : { "value" : "/demo-repo" }, "repo2-retention-full" : { "value" : "4" }, "repo2-type" : { "value" : "azure" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/12/demo", "", "[global]", "process-max=4", "repo1-cipher-pass=zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO", "repo1-cipher-type=aes-256-cbc", "repo1-path=/var/lib/pgbackrest", "repo1-retention-diff=2", "repo1-retention-full=2", "repo2-azure-account=pgbackrest", "repo2-azure-container=demo-container", "repo2-azure-key=YXpLZXk=", "repo2-path=/demo-repo", "repo2-retention-full=4", "repo2-type=azure", "start-fast=y", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "echo \"172.17.0.2 pgbackrest.blob.core.windows.net\" | tee -a /etc/hosts" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : "root" }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --repo=2 repo-create" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --log-level-console=info stanza-create" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "completed successfully" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 2 lines of output]", "P00 INFO: stanza 'demo' already exists on repo1 and is valid", "P00 INFO: stanza-create for stanza 'demo' on repo2", "P00 INFO: stanza-create command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --repo=2 \\", " --log-level-console=info backup" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "no prior backup exists|full backup size" ] }, "host" : 
"pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: backup command begin 2.37: --exec-id=2571-9dfe952d --log-level-console=info --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/postgresql/12/demo --process-max=4 --repo=2 --repo2-azure-account= --repo2-azure-container=demo-container --repo2-azure-key= --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo2-path=/demo-repo --repo1-retention-diff=2 --repo1-retention-full=2 --repo2-retention-full=4 --repo2-type=azure --stanza=demo --start-fast", "P00 WARN: no prior backup exists, incr backup has been changed to full", "P00 INFO: execute non-exclusive pg_start_backup(): backup begins after the requested immediate checkpoint completes", "P00 INFO: backup start archive = 000000070000000000000018, lsn = 0/18000028", " [filtered 3 lines of output]", "P00 INFO: check archive for segment(s) 000000070000000000000018:000000070000000000000018", "P00 INFO: new backup label = 20211231-195743F", "P00 INFO: full backup size = 31MB, file total = 1282", "P00 INFO: backup command end: completed successfully", "P00 INFO: expire command begin 2.37: --exec-id=2571-9dfe952d --log-level-console=info --log-level-stderr=off --no-log-timestamp --repo=2 --repo2-azure-account= --repo2-azure-container=demo-container --repo2-azure-key= --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo2-path=/demo-repo --repo1-retention-diff=2 --repo1-retention-full=2 --repo2-retention-full=4 --repo2-type=azure --stanza=demo" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "process-max" : { "value" : "4" }, "repo3-path" : { "value" : "/demo-repo" }, "repo3-retention-full" : { "value" : "4" }, "repo3-s3-bucket" : { "value" : "demo-bucket" }, "repo3-s3-endpoint" : { "value" : "s3.us-east-1.amazonaws.com" }, "repo3-s3-key" : { "value" : 
"accessKey1" }, "repo3-s3-key-secret" : { "value" : "verySecretKey1" }, "repo3-s3-region" : { "value" : "us-east-1" }, "repo3-type" : { "value" : "s3" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/12/demo", "", "[global]", "process-max=4", "repo1-cipher-pass=zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO", "repo1-cipher-type=aes-256-cbc", "repo1-path=/var/lib/pgbackrest", "repo1-retention-diff=2", "repo1-retention-full=2", "repo2-azure-account=pgbackrest", "repo2-azure-container=demo-container", "repo2-azure-key=YXpLZXk=", "repo2-path=/demo-repo", "repo2-retention-full=4", "repo2-type=azure", "repo3-path=/demo-repo", "repo3-retention-full=4", "repo3-s3-bucket=demo-bucket", "repo3-s3-endpoint=s3.us-east-1.amazonaws.com", "repo3-s3-key=accessKey1", "repo3-s3-key-secret=verySecretKey1", "repo3-s3-region=us-east-1", "repo3-type=s3", "start-fast=y", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "echo \"172.17.0.3 demo-bucket.s3.us-east-1.amazonaws.com s3.us-east-1.amazonaws.com\" | tee -a /etc/hosts" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : "root" }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --repo=3 repo-create" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --log-level-console=info stanza-create" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "completed successfully" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 4 lines of output]", "P00 INFO: stanza 'demo' already exists on repo2 and is valid", "P00 INFO: stanza-create for stanza 'demo' on repo3", "P00 INFO: stanza-create command end: completed 
successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --repo=3 \\", " --log-level-console=info backup" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "no prior backup exists|full backup size" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: backup command begin 2.37: --exec-id=2636-5e500c5a --log-level-console=info --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/postgresql/12/demo --process-max=4 --repo=3 --repo2-azure-account= --repo2-azure-container=demo-container --repo2-azure-key= --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo2-path=/demo-repo --repo3-path=/demo-repo --repo1-retention-diff=2 --repo1-retention-full=2 --repo2-retention-full=4 --repo3-retention-full=4 --repo3-s3-bucket=demo-bucket --repo3-s3-endpoint=s3.us-east-1.amazonaws.com --repo3-s3-key= --repo3-s3-key-secret= --repo3-s3-region=us-east-1 --repo2-type=azure --repo3-type=s3 --stanza=demo --start-fast", "P00 WARN: no prior backup exists, incr backup has been changed to full", "P00 INFO: execute non-exclusive pg_start_backup(): backup begins after the requested immediate checkpoint completes", "P00 INFO: backup start archive = 000000070000000000000019, lsn = 0/19000028", " [filtered 3 lines of output]", "P00 INFO: check archive for segment(s) 000000070000000000000019:00000007000000000000001A", "P00 INFO: new backup label = 20211231-195800F", "P00 INFO: full backup size = 31MB, file total = 1282", "P00 INFO: backup command end: completed successfully", "P00 INFO: expire command begin 2.37: --exec-id=2636-5e500c5a --log-level-console=info --log-level-stderr=off --no-log-timestamp --repo=3 --repo2-azure-account= --repo2-azure-container=demo-container --repo2-azure-key= --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo2-path=/demo-repo 
--repo3-path=/demo-repo --repo1-retention-diff=2 --repo1-retention-full=2 --repo2-retention-full=4 --repo3-retention-full=4 --repo3-s3-bucket=demo-bucket --repo3-s3-endpoint=s3.us-east-1.amazonaws.com --repo3-s3-key= --repo3-s3-key-secret= --repo3-s3-region=us-east-1 --repo2-type=azure --repo3-type=s3 --stanza=demo" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "process-max" : { "value" : "4" }, "repo4-gcs-bucket" : { "value" : "demo-bucket" }, "repo4-gcs-key" : { "value" : "/etc/pgbackrest/gcs-key.json" }, "repo4-path" : { "value" : "/demo-repo" }, "repo4-type" : { "value" : "gcs" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/12/demo", "", "[global]", "process-max=4", "repo1-cipher-pass=zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO", "repo1-cipher-type=aes-256-cbc", "repo1-path=/var/lib/pgbackrest", "repo1-retention-diff=2", "repo1-retention-full=2", "repo2-azure-account=pgbackrest", "repo2-azure-container=demo-container", "repo2-azure-key=YXpLZXk=", "repo2-path=/demo-repo", "repo2-retention-full=4", "repo2-type=azure", "repo3-path=/demo-repo", "repo3-retention-full=4", "repo3-s3-bucket=demo-bucket", "repo3-s3-endpoint=s3.us-east-1.amazonaws.com", "repo3-s3-key=accessKey1", "repo3-s3-key-secret=verySecretKey1", "repo3-s3-region=us-east-1", "repo3-type=s3", "repo4-gcs-bucket=demo-bucket", "repo4-gcs-key=/etc/pgbackrest/gcs-key.json", "repo4-path=/demo-repo", "repo4-type=gcs", "start-fast=y", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo stop" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --log-level-console=info stop" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "completed 
successfully" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: stop command begin 2.37: --exec-id=2684-91d01886 --log-level-console=info --log-level-stderr=off --no-log-timestamp --stanza=demo", "P00 INFO: stop command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --repo=1 \\", " --log-level-console=info stanza-delete" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "completed successfully" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: stanza-delete command begin 2.37: --exec-id=2692-aa1d1fa5 --log-level-console=info --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/postgresql/12/demo --repo=1 --repo2-azure-account= --repo2-azure-container=demo-container --repo2-azure-key= --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo4-gcs-bucket=demo-bucket --repo4-gcs-key= --repo1-path=/var/lib/pgbackrest --repo2-path=/demo-repo --repo3-path=/demo-repo --repo4-path=/demo-repo --repo3-s3-bucket=demo-bucket --repo3-s3-endpoint=s3.us-east-1.amazonaws.com --repo3-s3-key= --repo3-s3-key-secret= --repo3-s3-region=us-east-1 --repo2-type=azure --repo3-type=s3 --repo4-type=gcs --stanza=demo", "P00 INFO: stanza-delete command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo start" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "id" : "repo1", "image" : "pgbackrest/doc:debian", "name" : "repository", "option" : "-m 512m -v /sys/fs/cgroup:/sys/fs/cgroup:rw -v /tmp/$(mktemp -d):/run", "os" : "debian", "update-hosts" : true }, "type" : "host", "value" : { "ip" : "172.17.0.6" } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo adduser --disabled-password --gecos \"\" 
pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo apt-get install postgresql-client libxml2" ], "cmd-extra" : "-y 2>&1", "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo scp build:/build/pgbackrest-release-2.37/src/pgbackrest /usr/bin" ], "cmd-extra" : "2>&1", "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chmod 755 /usr/bin/pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p -m 770 /var/log/pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown pgbackrest:pgbackrest /var/log/pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p /etc/pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p /etc/pgbackrest/conf.d" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo touch /etc/pgbackrest/pgbackrest.conf" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chmod 640 /etc/pgbackrest/pgbackrest.conf" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown pgbackrest:pgbackrest 
/etc/pgbackrest/pgbackrest.conf" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p /var/lib/pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chmod 750 /var/lib/pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown pgbackrest:pgbackrest /var/lib/pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest mkdir -m 750 /home/pgbackrest/.ssh" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest ssh-keygen -f /home/pgbackrest/.ssh/id_rsa \\", " -t rsa -b 4096 -N \"\"" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres mkdir -m 750 -p /var/lib/postgresql/.ssh" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres ssh-keygen -f /var/lib/postgresql/.ssh/id_rsa \\", " -t rsa -b 4096 -N \"\"" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "(echo -n 'no-agent-forwarding,no-X11-forwarding,no-port-forwarding,' && \\", " echo -n 'command=\"/usr/bin/pgbackrest ${SSH_ORIGINAL_COMMAND#* }\" ' && \\", " sudo ssh root@pg-primary cat /var/lib/postgresql/.ssh/id_rsa.pub) | \\", " sudo -u pgbackrest tee -a /home/pgbackrest/.ssh/authorized_keys" ], "host" : "repository", "load-env" : true, "output" : 
false, "run-as-user" : "root" }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "(echo -n 'no-agent-forwarding,no-X11-forwarding,no-port-forwarding,' && \\", " echo -n 'command=\"/usr/bin/pgbackrest ${SSH_ORIGINAL_COMMAND#* }\" ' && \\", " sudo ssh root@repository cat /home/pgbackrest/.ssh/id_rsa.pub) | \\", " sudo -u postgres tee -a /var/lib/postgresql/.ssh/authorized_keys" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : "root" }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest ssh postgres@pg-primary" ], "cmd-extra" : "-o StrictHostKeyChecking=no", "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres ssh pgbackrest@repository" ], "cmd-extra" : "-o StrictHostKeyChecking=no", "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "repository", "option" : { "global" : { "repo1-path" : { "value" : "/var/lib/pgbackrest" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[global]", "repo1-path=/var/lib/pgbackrest" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "repository", "option" : { "demo" : { "pg1-host" : { "value" : "pg-primary" }, "pg1-path" : { "value" : "/var/lib/postgresql/12/demo" } }, "global" : { "log-level-stderr" : { "value" : "off" }, "log-timestamp" : { "value" : "n" }, "repo1-retention-full" : { "value" : "2" }, "start-fast" : { "value" : "y" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-host=pg-primary", "pg1-path=/var/lib/postgresql/12/demo", "", "[global]", "repo1-path=/var/lib/pgbackrest", "repo1-retention-full=2", "start-fast=y" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "demo" : { "pg1-path" : { "value" : "/var/lib/postgresql/12/demo" 
} }, "global" : { "log-level-file" : { "value" : "detail" }, "log-level-stderr" : { "value" : "off" }, "log-timestamp" : { "value" : "n" }, "repo1-host" : { "value" : "repository" } } }, "reset" : true }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/12/demo", "", "[global]", "log-level-file=detail", "repo1-host=repository" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo stanza-create" ], "host" : "repository", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo check" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo check" ], "host" : "repository", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo backup" ], "host" : "repository", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 WARN: no prior backup exists, incr backup has been changed to full" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo stop" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --delta restore" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo start" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : 
null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo --type=full backup" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "repository", "option" : { "global" : { "process-max" : { "value" : "3" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-host=pg-primary", "pg1-path=/var/lib/postgresql/12/demo", "", "[global]", "process-max=3", "repo1-path=/var/lib/pgbackrest", "repo1-retention-full=2", "start-fast=y" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo --type=full backup" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest info" ], "highlight" : { "filter" : false, "filter-context" : 2, "list" : [ "timestamp start/stop" ] }, "host" : "repository", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "stanza: demo", " status: ok", " cipher: none", "", " db (current)", " wal archive min/max (12): 000000080000000000000020/000000080000000000000022", "", " full backup: 20211231-195910F", " timestamp start/stop: 2021-12-31 19:59:10 / 2021-12-31 19:59:15", " wal start/stop: 000000080000000000000020 / 000000080000000000000020", " database size: 31MB, database backup size: 31MB", " repo1: backup set size: 3.7MB, backup size: 3.7MB", "", " full backup: 20211231-195919F", " timestamp start/stop: 2021-12-31 19:59:19 / 2021-12-31 19:59:25", " wal start/stop: 000000080000000000000022 / 000000080000000000000022", " database size: 31MB, database backup size: 31MB", " repo1: backup set size: 3.7MB, backup size: 3.7MB" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest stop" ], "host" : "pg-primary", "load-env" : true, "output" : false, 
"run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo backup" ], "err-expect" : "56", "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "\\: stop file exists for all stanzas" ] }, "host" : "repository", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 WARN: unable to check pg-1: [StopError] raised from remote-0 ssh protocol on 'pg-primary': stop file exists for all stanzas", "P00 ERROR: [056]: unable to find primary cluster - cannot proceed" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest stop" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 WARN: stop file already exists for all stanzas" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest start" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo stop" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo backup" ], "err-expect" : "56", "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "\\: stop file exists for stanza demo" ] }, "host" : "repository", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 WARN: unable to check pg-1: [StopError] raised from remote-0 ssh protocol on 'pg-primary': stop file exists for stanza demo", "P00 ERROR: [056]: unable to find primary cluster - cannot proceed" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo start" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, 
{ "key" : { "id" : "pg2", "image" : "pgbackrest/doc:debian", "name" : "pg-standby", "option" : "-m 512m -v /sys/fs/cgroup:/sys/fs/cgroup:rw -v /tmp/$(mktemp -d):/run", "os" : "debian", "update-hosts" : true }, "type" : "host", "value" : { "ip" : "172.17.0.7" } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo apt-get install postgresql-client libxml2" ], "cmd-extra" : "-y 2>&1", "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo scp build:/build/pgbackrest-release-2.37/src/pgbackrest /usr/bin" ], "cmd-extra" : "2>&1", "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chmod 755 /usr/bin/pgbackrest" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p -m 770 /var/log/pgbackrest" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown postgres:postgres /var/log/pgbackrest" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p /etc/pgbackrest" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p /etc/pgbackrest/conf.d" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo touch /etc/pgbackrest/pgbackrest.conf" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chmod 640 /etc/pgbackrest/pgbackrest.conf" ], "host" : "pg-standby", "load-env" : true, "output" 
: false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown postgres:postgres /etc/pgbackrest/pgbackrest.conf" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres mkdir -m 750 -p /var/lib/postgresql/.ssh" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres ssh-keygen -f /var/lib/postgresql/.ssh/id_rsa \\", " -t rsa -b 4096 -N \"\"" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "(echo -n 'no-agent-forwarding,no-X11-forwarding,no-port-forwarding,' && \\", " echo -n 'command=\"/usr/bin/pgbackrest ${SSH_ORIGINAL_COMMAND#* }\" ' && \\", " sudo ssh root@pg-standby cat /var/lib/postgresql/.ssh/id_rsa.pub) | \\", " sudo -u pgbackrest tee -a /home/pgbackrest/.ssh/authorized_keys" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : "root" }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "(echo -n 'no-agent-forwarding,no-X11-forwarding,no-port-forwarding,' && \\", " echo -n 'command=\"/usr/bin/pgbackrest ${SSH_ORIGINAL_COMMAND#* }\" ' && \\", " sudo ssh root@repository cat /home/pgbackrest/.ssh/id_rsa.pub) | \\", " sudo -u postgres tee -a /var/lib/postgresql/.ssh/authorized_keys" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : "root" }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest ssh postgres@pg-standby" ], "cmd-extra" : "-o StrictHostKeyChecking=no", "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres ssh pgbackrest@repository" ], "cmd-extra" : "-o StrictHostKeyChecking=no", "host" : 
"pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-standby", "option" : { "demo" : { "pg1-path" : { "value" : "/var/lib/postgresql/12/demo" } }, "global" : { "log-level-file" : { "value" : "detail" }, "log-level-stderr" : { "value" : "off" }, "log-timestamp" : { "value" : "n" }, "repo1-host" : { "value" : "repository" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/12/demo", "", "[global]", "log-level-file=detail", "repo1-host=repository" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_createcluster 12 demo" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --delta --type=standby restore" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/lib/postgresql/12/demo/postgresql.auto.conf" ], "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "# Do not edit this file manually!", "# It will be overwritten by the ALTER SYSTEM command.", "", "# Recovery settings generated by pgBackRest restore on 2021-12-31 19:55:45", "restore_command = 'pgbackrest --stanza=demo archive-get %f \"%p\"'", "", "# Recovery settings generated by pgBackRest restore on 2021-12-31 19:56:32", "restore_command = 'pgbackrest --stanza=demo archive-get %f \"%p\"'", "", "# Recovery settings generated by pgBackRest restore on 2021-12-31 19:56:49", "restore_command = 'pgbackrest --stanza=demo archive-get %f \"%p\"'", "# Removed by pgBackRest restore on 2021-12-31 19:57:31 # recovery_target = 'immediate'", "# Removed by pgBackRest restore on 2021-12-31 19:57:31 # recovery_target_action = 'promote'", "", "# 
Recovery settings generated by pgBackRest restore on 2021-12-31 19:57:31", "restore_command = 'pgbackrest --stanza=demo archive-get %f \"%p\"'", "# Removed by pgBackRest restore on 2021-12-31 19:59:02 # recovery_target_time = '2021-12-31 19:57:03.172377+00'", "# Removed by pgBackRest restore on 2021-12-31 19:59:02 # recovery_target_action = 'promote'", "", "# Recovery settings generated by pgBackRest restore on 2021-12-31 19:59:02", "restore_command = 'pgbackrest --stanza=demo archive-get %f \"%p\"'", "", "# Recovery settings generated by pgBackRest restore on 2021-12-31 20:00:00", "restore_command = 'pgbackrest --stanza=demo archive-get %f \"%p\"'" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "echo 'shared_buffers = 16MB' >> /etc/postgresql/12/demo/postgresql.conf" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : "root" }, "type" : "exe" }, { "key" : { "file" : "/etc/postgresql/12/demo/postgresql.conf", "host" : "pg-standby", "option" : { "archive_command" : { "value" : "'pgbackrest --stanza=demo archive-push %p'" }, "archive_mode" : { "value" : "on" }, "hot_standby" : { "value" : "on" }, "log_filename" : { "value" : "'postgresql.log'" }, "log_line_prefix" : { "value" : "''" }, "max_wal_senders" : { "value" : "3" }, "wal_level" : { "value" : "replica" } } }, "type" : "cfg-postgresql", "value" : { "config" : [ "archive_command = 'pgbackrest --stanza=demo archive-push %p'", "archive_mode = on", "hot_standby = on", "log_filename = 'postgresql.log'", "log_line_prefix = ''", "max_wal_senders = 3", "wal_level = replica" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo rm /var/log/postgresql/postgresql-12-demo.log" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo start" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, 
"cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/log/postgresql/postgresql-12-demo.log" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "entering standby mode|database system is ready to accept read only connections" ] }, "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 4 lines of output]", "LOG: listening on Unix socket \"/var/run/postgresql/.s.PGSQL.5432\"", "LOG: database system was interrupted; last known up at 2021-12-31 19:59:19 UTC", "LOG: entering standby mode", "LOG: restored log file \"00000008.history\" from archive", "LOG: restored log file \"000000080000000000000022\" from archive", "LOG: redo starts at 0/22000028", "LOG: consistent recovery state reached at 0/22000100", "LOG: database system is ready to accept read only connections" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \" \\", " begin; \\", " create table replicated_table (message text); \\", " insert into replicated_table values ('Important Data'); \\", " commit; \\", " select * from replicated_table\";" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "Important Data" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " message ", "----------------", " Important Data", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"select * from replicated_table;\"" ], "err-expect" : "1", "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "does not exist" ] }, "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "ERROR: relation \"replicated_table\" does not exist", "LINE 1: select * from 
replicated_table;", " ^" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"select *, current_timestamp from pg_switch_wal()\";" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " pg_switch_wal | current_timestamp ", "---------------+-------------------------------", " 0/23021750 | 2021-12-31 20:00:14.253018+00", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \" \\", " select *, current_timestamp from replicated_table\"" ], "highlight" : { "filter" : false, "filter-context" : 2, "list" : [ "Important Data" ] }, "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " message | current_timestamp ", "----------------+-------------------------------", " Important Data | 2021-12-31 20:00:17.565198+00", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --log-level-console=info check" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "because this is a standby" ] }, "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: check command begin 2.37: --exec-id=1338-239cb6ec --log-level-console=info --log-level-file=detail --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/postgresql/12/demo --repo1-host=repository --stanza=demo", "P00 INFO: check repo1 (standby)", "P00 INFO: switch wal not performed because this is a standby", "P00 INFO: check command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \" \\", " create user replicator password 'jw8s0F4' replication\";" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "CREATE ROLE" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ 
"sudo -u postgres sh -c 'echo \\", " \"host replication replicator 172.17.0.7/32 md5\" \\", " >> /etc/postgresql/12/demo/pg_hba.conf'" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo reload" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-standby", "option" : { "demo" : { "recovery-option" : { "value" : "primary_conninfo=host=172.17.0.5 port=5432 user=replicator" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/12/demo", "recovery-option=primary_conninfo=host=172.17.0.5 port=5432 user=replicator", "", "[global]", "log-level-file=detail", "repo1-host=repository" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sh -c 'echo \\", " \"172.17.0.5:*:replication:replicator:jw8s0F4\" \\", " >> /var/lib/postgresql/.pgpass'" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres chmod 600 /var/lib/postgresql/.pgpass" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo stop" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --delta --type=standby restore" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/lib/postgresql/12/demo/postgresql.auto.conf" ], "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ 
"# Do not edit this file manually!", "# It will be overwritten by the ALTER SYSTEM command.", "", "# Recovery settings generated by pgBackRest restore on 2021-12-31 19:55:45", "restore_command = 'pgbackrest --stanza=demo archive-get %f \"%p\"'", "", "# Recovery settings generated by pgBackRest restore on 2021-12-31 19:56:32", "restore_command = 'pgbackrest --stanza=demo archive-get %f \"%p\"'", "", "# Recovery settings generated by pgBackRest restore on 2021-12-31 19:56:49", "restore_command = 'pgbackrest --stanza=demo archive-get %f \"%p\"'", "# Removed by pgBackRest restore on 2021-12-31 19:57:31 # recovery_target = 'immediate'", "# Removed by pgBackRest restore on 2021-12-31 19:57:31 # recovery_target_action = 'promote'", "", "# Recovery settings generated by pgBackRest restore on 2021-12-31 19:57:31", "restore_command = 'pgbackrest --stanza=demo archive-get %f \"%p\"'", "# Removed by pgBackRest restore on 2021-12-31 19:59:02 # recovery_target_time = '2021-12-31 19:57:03.172377+00'", "# Removed by pgBackRest restore on 2021-12-31 19:59:02 # recovery_target_action = 'promote'", "", "# Recovery settings generated by pgBackRest restore on 2021-12-31 19:59:02", "restore_command = 'pgbackrest --stanza=demo archive-get %f \"%p\"'", "", "# Recovery settings generated by pgBackRest restore on 2021-12-31 20:00:26", "primary_conninfo = 'host=172.17.0.5 port=5432 user=replicator'", "restore_command = 'pgbackrest --stanza=demo archive-get %f \"%p\"'" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo rm /var/log/postgresql/postgresql-12-demo.log" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo start" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-standby", "load-env" : true, "output" : false, 
"run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/log/postgresql/postgresql-12-demo.log" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "started streaming WAL from primary" ] }, "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 11 lines of output]", "LOG: database system is ready to accept read only connections", "LOG: restored log file \"000000080000000000000023\" from archive", "LOG: started streaming WAL from primary at 0/24000000 on timeline 8" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \" \\", " begin; \\", " create table stream_table (message text); \\", " insert into stream_table values ('Important Data'); \\", " commit; \\", " select *, current_timestamp from stream_table\";" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "Important Data" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " message | current_timestamp ", "----------------+-------------------------------", " Important Data | 2021-12-31 20:00:34.575889+00", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \" \\", " select *, current_timestamp from stream_table\"" ], "highlight" : { "filter" : false, "filter-context" : 2, "list" : [ "Important Data" ] }, "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " message | current_timestamp ", "----------------+-------------------------------", " Important Data | 2021-12-31 20:00:35.482415+00", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p -m 750 /var/spool/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo 
chown postgres:postgres /var/spool/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p -m 750 /var/spool/pgbackrest" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown postgres:postgres /var/spool/pgbackrest" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "archive-async" : { "value" : "y" }, "spool-path" : { "value" : "/var/spool/pgbackrest" } }, "global:archive-get" : { "process-max" : { "value" : "2" } }, "global:archive-push" : { "process-max" : { "value" : "2" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/12/demo", "", "[global]", "archive-async=y", "log-level-file=detail", "repo1-host=repository", "spool-path=/var/spool/pgbackrest", "", "[global:archive-get]", "process-max=2", "", "[global:archive-push]", "process-max=2" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-standby", "option" : { "global" : { "archive-async" : { "value" : "y" }, "spool-path" : { "value" : "/var/spool/pgbackrest" } }, "global:archive-get" : { "process-max" : { "value" : "2" } }, "global:archive-push" : { "process-max" : { "value" : "2" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/12/demo", "recovery-option=primary_conninfo=host=172.17.0.5 port=5432 user=replicator", "", "[global]", "archive-async=y", "log-level-file=detail", "repo1-host=repository", "spool-path=/var/spool/pgbackrest", "", "[global:archive-get]", "process-max=2", "", "[global:archive-push]", "process-max=2" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"alter user 
replicator password 'bogus'\"" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "ALTER ROLE" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo restart" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres rm -f /var/log/pgbackrest/demo-archive-push-async.log" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \" \\", " select pg_create_restore_point('test async push'); select pg_switch_wal(); \\", " select pg_create_restore_point('test async push'); select pg_switch_wal(); \\", " select pg_create_restore_point('test async push'); select pg_switch_wal(); \\", " select pg_create_restore_point('test async push'); select pg_switch_wal(); \\", " select pg_create_restore_point('test async push'); select pg_switch_wal();\"" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --log-level-console=info check" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "WAL segment" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: check command begin 2.37: --exec-id=3279-1d1209e9 --log-level-console=info --log-level-file=detail --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/postgresql/12/demo --repo1-host=repository --stanza=demo", "P00 INFO: check repo1 configuration (primary)", "P00 INFO: check repo1 archive for WAL (primary)", "P00 INFO: WAL segment 000000080000000000000029 successfully archived to 
'/var/lib/pgbackrest/archive/demo/12-1/0000000800000000/000000080000000000000029-45c70258c3af934d29956e009b6f211e949dcb1a.gz' on repo1", "P00 INFO: check command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/log/pgbackrest/demo-archive-push-async.log" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ " WAL file\\(s\\) to archive|pushed WAL file \\'0000000" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "-------------------PROCESS START-------------------", "P00 INFO: archive-push:async command begin 2.37: [/var/lib/postgresql/12/demo/pg_wal] --archive-async --exec-id=3265-085154e0 --log-level-console=off --log-level-file=detail --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/postgresql/12/demo --process-max=2 --repo1-host=repository --spool-path=/var/spool/pgbackrest --stanza=demo", "P00 INFO: push 1 WAL file(s) to archive: 000000080000000000000024", "P01 DETAIL: pushed WAL file '000000080000000000000024' to the archive", "P00 INFO: archive-push:async command end: completed successfully", "", "-------------------PROCESS START-------------------", "P00 INFO: archive-push:async command begin 2.37: [/var/lib/postgresql/12/demo/pg_wal] --archive-async --exec-id=3283-c2b7f5b4 --log-level-console=off --log-level-file=detail --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/postgresql/12/demo --process-max=2 --repo1-host=repository --spool-path=/var/spool/pgbackrest --stanza=demo", "P00 INFO: push 4 WAL file(s) to archive: 000000080000000000000025...000000080000000000000028", "P02 DETAIL: pushed WAL file '000000080000000000000026' to the archive", "P01 DETAIL: pushed WAL file '000000080000000000000025' to the archive", "P02 DETAIL: pushed WAL file '000000080000000000000027' to the archive", "P01 DETAIL: pushed WAL file '000000080000000000000028' to the archive", "P00 INFO: 
archive-push:async command end: completed successfully", "", "-------------------PROCESS START-------------------", "P00 INFO: archive-push:async command begin 2.37: [/var/lib/postgresql/12/demo/pg_wal] --archive-async --exec-id=3299-b52ab60e --log-level-console=off --log-level-file=detail --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/postgresql/12/demo --process-max=2 --repo1-host=repository --spool-path=/var/spool/pgbackrest --stanza=demo", "P00 INFO: push 1 WAL file(s) to archive: 000000080000000000000029", "P01 DETAIL: pushed WAL file '000000080000000000000029' to the archive", "P00 INFO: archive-push:async command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 5" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/log/pgbackrest/demo-archive-get-async.log" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "found [0-F]{24} in the .* archive" ] }, "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "-------------------PROCESS START-------------------", "P00 INFO: archive-get:async command begin 2.37: [000000080000000000000022, 000000080000000000000023, 000000080000000000000024, 000000080000000000000025, 000000080000000000000026, 000000080000000000000027, 000000080000000000000028, 000000080000000000000029] --archive-async --exec-id=1548-7fe4e456 --log-level-console=off --log-level-file=detail --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/postgresql/12/demo --process-max=2 --repo1-host=repository --spool-path=/var/spool/pgbackrest --stanza=demo", "P00 INFO: get 8 WAL file(s) from archive: 000000080000000000000022...000000080000000000000029", "P01 DETAIL: found 000000080000000000000022 in the repo1: 12-1 archive", "P02 DETAIL: found 000000080000000000000023 in the repo1: 
12-1 archive", "P00 DETAIL: unable to find 000000080000000000000024 in the archive", "P00 INFO: archive-get:async command end: completed successfully", " [filtered 14 lines of output]", "P00 INFO: archive-get:async command begin 2.37: [000000080000000000000024, 000000080000000000000025, 000000080000000000000026, 000000080000000000000027, 000000080000000000000028, 000000080000000000000029, 00000008000000000000002A, 00000008000000000000002B] --archive-async --exec-id=1590-245d7f0f --log-level-console=off --log-level-file=detail --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/postgresql/12/demo --process-max=2 --repo1-host=repository --spool-path=/var/spool/pgbackrest --stanza=demo", "P00 INFO: get 8 WAL file(s) from archive: 000000080000000000000024...00000008000000000000002B", "P01 DETAIL: found 000000080000000000000024 in the repo1: 12-1 archive", "P00 DETAIL: unable to find 000000080000000000000025 in the archive", "P00 INFO: archive-get:async command end: completed successfully", " [filtered 2 lines of output]", "P00 INFO: archive-get:async command begin 2.37: [000000080000000000000025, 000000080000000000000026, 000000080000000000000027, 000000080000000000000028, 000000080000000000000029, 00000008000000000000002A, 00000008000000000000002B, 00000008000000000000002C] --archive-async --exec-id=1598-571f7ef0 --log-level-console=off --log-level-file=detail --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/postgresql/12/demo --process-max=2 --repo1-host=repository --spool-path=/var/spool/pgbackrest --stanza=demo", "P00 INFO: get 8 WAL file(s) from archive: 000000080000000000000025...00000008000000000000002C", "P02 DETAIL: found 000000080000000000000026 in the repo1: 12-1 archive", "P01 DETAIL: found 000000080000000000000025 in the repo1: 12-1 archive", "P00 DETAIL: unable to find 000000080000000000000027 in the archive", "P00 INFO: archive-get:async command end: completed successfully", " [filtered 2 lines of output]", "P00 INFO: 
archive-get:async command begin 2.37: [000000080000000000000027, 000000080000000000000028, 000000080000000000000029, 00000008000000000000002A, 00000008000000000000002B, 00000008000000000000002C, 00000008000000000000002D, 00000008000000000000002E] --archive-async --exec-id=1607-3bece536 --log-level-console=off --log-level-file=detail --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/postgresql/12/demo --process-max=2 --repo1-host=repository --spool-path=/var/spool/pgbackrest --stanza=demo", "P00 INFO: get 8 WAL file(s) from archive: 000000080000000000000027...00000008000000000000002E", "P02 DETAIL: found 000000080000000000000028 in the repo1: 12-1 archive", "P01 DETAIL: found 000000080000000000000027 in the repo1: 12-1 archive", "P02 DETAIL: found 000000080000000000000029 in the repo1: 12-1 archive", "P00 DETAIL: unable to find 00000008000000000000002A in the archive", "P00 INFO: archive-get:async command end: completed successfully", " [filtered 17 lines of output]" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"alter user replicator password 'jw8s0F4'\"" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "ALTER ROLE" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "repository", "option" : { "demo" : { "pg2-host" : { "value" : "pg-standby" }, "pg2-path" : { "value" : "/var/lib/postgresql/12/demo" } }, "global" : { "backup-standby" : { "value" : "y" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-host=pg-primary", "pg1-path=/var/lib/postgresql/12/demo", "pg2-host=pg-standby", "pg2-path=/var/lib/postgresql/12/demo", "", "[global]", "backup-standby=y", "process-max=3", "repo1-path=/var/lib/pgbackrest", "repo1-retention-full=2", "start-fast=y" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo --log-level-console=detail backup" ], "highlight" : { "filter" : 
true, "filter-context" : 2, "list" : [ "backup file pg-primary|replay on the standby" ] }, "host" : "repository", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 2 lines of output]", "P00 INFO: execute non-exclusive pg_start_backup(): backup begins after the requested immediate checkpoint completes", "P00 INFO: backup start archive = 00000008000000000000002B, lsn = 0/2B000028", "P00 INFO: wait for replay on the standby to reach 0/2B000028", "P00 INFO: replay on the standby reached 0/2B000028", "P00 INFO: check archive for prior segment 00000008000000000000002A", "P01 DETAIL: backup file pg-primary:/var/lib/postgresql/12/demo/global/pg_control (8KB, 0%) checksum 35e8a0c90ca17e4ad572c965bc503b3559890160", "P01 DETAIL: backup file pg-primary:/var/lib/postgresql/12/demo/pg_logical/replorigin_checkpoint (8B, 0%) checksum 347fc8f2df71bd4436e38bd1516ccd7ea0d46532", "P02 DETAIL: backup file pg-standby:/var/lib/postgresql/12/demo/base/13398/2608 (456KB, 19%) checksum c5379fa0a0da3e312f114d08ef381e27802113f2", "P03 DETAIL: backup file pg-standby:/var/lib/postgresql/12/demo/base/13398/1249 (440KB, 38%) checksum 276c6ea0633afcf173bcb10ac3d0f6a1936b9502", " [filtered 1293 lines of output]" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo stop" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 12 demo stop" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres /usr/lib/postgresql/13/bin/initdb \\", " -D /var/lib/postgresql/13/demo -k -A peer" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_createcluster 13 demo" ], "host" : "pg-primary", "load-env" : true, 
"output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sh -c 'cd /var/lib/postgresql && \\", " /usr/lib/postgresql/13/bin/pg_upgrade \\", " --old-bindir=/usr/lib/postgresql/12/bin \\", " --new-bindir=/usr/lib/postgresql/13/bin \\", " --old-datadir=/var/lib/postgresql/12/demo \\", " --new-datadir=/var/lib/postgresql/13/demo \\", " --old-options=\" -c config_file=/etc/postgresql/12/demo/postgresql.conf\" \\", " --new-options=\" -c config_file=/etc/postgresql/13/demo/postgresql.conf\"'" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "Upgrade Complete" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 66 lines of output]", "Checking for extension updates ok", "", "Upgrade Complete", "----------------", "Optimizer statistics are not transferred by pg_upgrade so,", " [filtered 4 lines of output]" ] } }, { "key" : { "file" : "/etc/postgresql/13/demo/postgresql.conf", "host" : "pg-primary", "option" : { "archive_command" : { "value" : "'pgbackrest --stanza=demo archive-push %p'" }, "archive_mode" : { "value" : "on" }, "listen_addresses" : { "value" : "'*'" }, "log_line_prefix" : { "value" : "''" }, "max_wal_senders" : { "value" : "3" }, "port" : { "value" : "5432" }, "wal_level" : { "value" : "replica" } } }, "type" : "cfg-postgresql", "value" : { "config" : [ "archive_command = 'pgbackrest --stanza=demo archive-push %p'", "archive_mode = on", "listen_addresses = '*'", "log_line_prefix = ''", "max_wal_senders = 3", "port = 5432", "wal_level = replica" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "demo" : { "pg1-path" : { "value" : "/var/lib/postgresql/13/demo" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/13/demo", "", "[global]", "archive-async=y", "log-level-file=detail", 
"repo1-host=repository", "spool-path=/var/spool/pgbackrest", "", "[global:archive-get]", "process-max=2", "", "[global:archive-push]", "process-max=2" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-standby", "option" : { "demo" : { "pg1-path" : { "value" : "/var/lib/postgresql/13/demo" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/postgresql/13/demo", "recovery-option=primary_conninfo=host=172.17.0.5 port=5432 user=replicator", "", "[global]", "archive-async=y", "log-level-file=detail", "repo1-host=repository", "spool-path=/var/spool/pgbackrest", "", "[global:archive-get]", "process-max=2", "", "[global:archive-push]", "process-max=2" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "repository", "option" : { "demo" : { "pg1-path" : { "value" : "/var/lib/postgresql/13/demo" }, "pg2-path" : { "value" : "/var/lib/postgresql/13/demo" } }, "global" : { "backup-standby" : { "value" : "n" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-host=pg-primary", "pg1-path=/var/lib/postgresql/13/demo", "pg2-host=pg-standby", "pg2-path=/var/lib/postgresql/13/demo", "", "[global]", "backup-standby=n", "process-max=3", "repo1-path=/var/lib/pgbackrest", "repo1-retention-full=2", "start-fast=y" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo cp /etc/postgresql/12/demo/pg_hba.conf \\", " /etc/postgresql/13/demo/pg_hba.conf" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --no-online \\", " --log-level-console=info stanza-upgrade" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "completed successfully" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: stanza-upgrade command begin 2.37: --exec-id=3670-e88b2f92 
--log-level-console=info --log-level-file=detail --log-level-stderr=off --no-log-timestamp --no-online --pg1-path=/var/lib/postgresql/13/demo --repo1-host=repository --stanza=demo", "P00 INFO: stanza-upgrade for stanza 'demo' on repo1", "P00 INFO: stanza-upgrade command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 13 demo start" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pg_lsclusters" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo check" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_dropcluster 12 demo" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_dropcluster 12 demo" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_createcluster 13 demo" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo check" ], "host" : "repository", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 WARN: unable to check pg-2: [DbConnectError] raised from remote-0 ssh protocol on 'pg-standby': unable to connect to 'dbname='postgres' port=5432': connection to server on socket \"/var/run/postgresql/.s.PGSQL.5432\" failed: No such file or directory", " \tIs the server running locally and accepting connections on that socket?" 
] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo --type=full backup" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --delta --type=standby restore" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/etc/postgresql/13/demo/postgresql.conf", "host" : "pg-standby", "option" : { "hot_standby" : { "value" : "on" } } }, "type" : "cfg-postgresql", "value" : { "config" : [ "hot_standby = on" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo pg_ctlcluster 13 demo start" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo check" ], "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "repository", "option" : { "global" : { "backup-standby" : { "value" : "y" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-host=pg-primary", "pg1-path=/var/lib/postgresql/13/demo", "pg2-host=pg-standby", "pg2-path=/var/lib/postgresql/13/demo", "", "[global]", "backup-standby=y", "process-max=3", "repo1-path=/var/lib/pgbackrest", "repo1-retention-full=2", "start-fast=y" ] } } ] } }, "{\"os-type\":\"rhel\"}" : { "all" : { "user-guide" : [ { "key" : { "id" : "azure", "image" : "mcr.microsoft.com/azure-storage/azurite", "name" : "azure-server", "option" : "-m 128m -v {[host-repo-path]}/doc/resource/fake-cert/azure-server.crt:/root/public.crt:ro -v 
{[host-repo-path]}/doc/resource/fake-cert/azure-server.key:/root/private.key:ro -e AZURITE_ACCOUNTS='pgbackrest:YXpLZXk='", "os" : "rhel", "param" : "azurite-blob --blobPort 443 --blobHost 0.0.0.0 --cert=/root/public.crt --key=/root/private.key", "update-hosts" : false }, "type" : "host", "value" : { "ip" : "172.17.0.2" } }, { "key" : { "id" : "s3", "image" : "minio/minio", "name" : "s3-server", "option" : "-m 128m -v {[host-repo-path]}/doc/resource/fake-cert/s3-server.crt:/root/.minio/certs/public.crt:ro -v {[host-repo-path]}/doc/resource/fake-cert/s3-server.key:/root/.minio/certs/private.key:ro -e MINIO_REGION=us-east-1 -e MINIO_DOMAIN=s3.us-east-1.amazonaws.com -e MINIO_BROWSER=off -e MINIO_ACCESS_KEY=accessKey1 -e MINIO_SECRET_KEY=verySecretKey1", "os" : "rhel", "param" : "server /data --address :443", "update-hosts" : false }, "type" : "host", "value" : { "ip" : "172.17.0.3" } }, { "key" : { "id" : "build", "image" : "pgbackrest/doc:rhel", "name" : "build", "option" : "-m 256m -v /sys/fs/cgroup:/sys/fs/cgroup:rw -v /tmp/$(mktemp -d):/run", "os" : "rhel", "update-hosts" : true }, "type" : "host", "value" : { "ip" : "172.17.0.4" } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p /build/pgbackrest-release-2.37" ], "host" : "build", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo cp -r /pgbackrest/src /build/pgbackrest-release-2.37" ], "host" : "build", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown -R vagrant /build/pgbackrest-release-2.37" ], "host" : "build", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo yum install make gcc postgresql10-devel \\", " openssl-devel libxml2-devel lz4-devel libzstd-devel bzip2-devel libyaml-devel" ], "cmd-extra" : "-y 2>&1", "host" : "build", "load-env" : true, "output" : false, 
"run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "cd /build/pgbackrest-release-2.37/src && ./configure && make" ], "cmd-extra" : "-j 4", "host" : "build", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "id" : "pg1", "image" : "pgbackrest/doc:rhel", "name" : "pg-primary", "option" : "-m 512m -v /sys/fs/cgroup:/sys/fs/cgroup:rw -v /tmp/$(mktemp -d):/run", "os" : "rhel", "update-hosts" : true }, "type" : "host", "value" : { "ip" : "172.17.0.5" } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo yum install postgresql-libs" ], "cmd-extra" : "-y 2>&1", "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo scp build:/build/pgbackrest-release-2.37/src/pgbackrest /usr/bin" ], "cmd-extra" : "2>&1", "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chmod 755 /usr/bin/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p -m 770 /var/log/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown postgres:postgres /var/log/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p /etc/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p /etc/pgbackrest/conf.d" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo touch 
/etc/pgbackrest/pgbackrest.conf" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chmod 640 /etc/pgbackrest/pgbackrest.conf" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown postgres:postgres /etc/pgbackrest/pgbackrest.conf" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "pgBackRest 2.37 - General help", "", "Usage:", " pgbackrest [options] [command]", "", "Commands:", " archive-get Get a WAL segment from the archive.", " archive-push Push a WAL segment to the archive.", " backup Backup a database cluster.", " check Check the configuration.", " expire Expire backups that exceed retention.", " help Get help.", " info Retrieve information about backups.", " repo-get Get a file from a repository.", " repo-ls List files in a repository.", " restore Restore a database cluster.", " server pgBackRest server.", " server-ping Ping pgBackRest server.", " stanza-create Create the required stanza data.", " stanza-delete Delete a stanza.", " stanza-upgrade Upgrade a stanza.", " start Allow pgBackRest processes to run.", " stop Stop pgBackRest processes from running.", " version Get version.", "", "Use 'pgbackrest help [command]' for more information." 
] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres /usr/pgsql-10/bin/initdb \\", " -D /var/lib/pgsql/10/data -k -A peer" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "echo 'shared_buffers = 16MB' >> /var/lib/pgsql/10/data/postgresql.conf" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : "root" }, "type" : "exe" }, { "key" : { "file" : "/var/lib/pgsql/10/data/postgresql.conf", "host" : "pg-primary", "option" : { "listen_addresses" : { "value" : "'*'" } } }, "type" : "cfg-postgresql", "value" : { "config" : [ "listen_addresses = '*'" ] } }, { "key" : { "file" : "/var/lib/pgsql/10/data/postgresql.conf", "host" : "pg-primary", "option" : { "log_line_prefix" : { "value" : "''" } } }, "type" : "cfg-postgresql", "value" : { "config" : [ "listen_addresses = '*'", "log_line_prefix = ''" ] } }, { "key" : { "file" : "/var/lib/pgsql/10/data/postgresql.conf", "host" : "pg-primary", "option" : { "log_filename" : { "value" : "'postgresql.log'" } } }, "type" : "cfg-postgresql", "value" : { "config" : [ "listen_addresses = '*'", "log_filename = 'postgresql.log'", "log_line_prefix = ''" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "demo" : { "pg1-path" : { "value" : "/var/lib/pgsql/10/data" } }, "global" : { "log-level-stderr" : { "value" : "off" }, "log-timestamp" : { "value" : "n" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/10/data" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres bash -c ' \\", " export PGBACKREST_LOG_PATH=/path/set/by/env && \\", " pgbackrest --log-level-console=error help backup log-path'" ], "highlight" : { "filter" : false, "filter-context" : 2, "list" : [ "current\\: \\/path\\/set\\/by\\/env" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : 
"exe", "value" : { "output" : [ "pgBackRest 2.37 - 'backup' command - 'log-path' option help", "", "Path where log files are stored.", "", "The log path provides a location for pgBackRest to store log files. Note that", "if log-level-file=off then no log path is required.", "", "current: /path/set/by/env", "default: /var/log/pgbackrest" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p /var/lib/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chmod 750 /var/lib/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown postgres:postgres /var/lib/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "repo1-path" : { "value" : "/var/lib/pgbackrest" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/10/data", "", "[global]", "repo1-path=/var/lib/pgbackrest" ] } }, { "key" : { "file" : "/var/lib/pgsql/10/data/postgresql.conf", "host" : "pg-primary", "option" : { "archive_command" : { "value" : "'pgbackrest --stanza=demo archive-push %p'" }, "archive_mode" : { "value" : "on" }, "max_wal_senders" : { "value" : "3" }, "wal_level" : { "value" : "replica" } } }, "type" : "cfg-postgresql", "value" : { "config" : [ "archive_command = 'pgbackrest --stanza=demo archive-push %p'", "archive_mode = on", "listen_addresses = '*'", "log_filename = 'postgresql.log'", "log_line_prefix = ''", "max_wal_senders = 3", "wal_level = replica" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl restart postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { 
"key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global:archive-push" : { "compress-level" : { "value" : "3" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/10/data", "", "[global]", "repo1-path=/var/lib/pgbackrest", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "repo1-retention-full" : { "value" : "2" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/10/data", "", "[global]", "repo1-path=/var/lib/pgbackrest", "repo1-retention-full=2", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "repo1-cipher-pass" : { "value" : "zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO" }, "repo1-cipher-type" : { "value" : "aes-256-cbc" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/10/data", "", "[global]", "repo1-cipher-pass=zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO", "repo1-cipher-type=aes-256-cbc", "repo1-path=/var/lib/pgbackrest", "repo1-retention-full=2", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --log-level-console=info stanza-create" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "completed successfully" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: stanza-create command begin 2.37: --exec-id=952-6f56cb91 --log-level-console=info --log-level-stderr=off 
--no-log-timestamp --pg1-path=/var/lib/pgsql/10/data --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --stanza=demo", "P00 INFO: stanza-create for stanza 'demo' on repo1", "P00 INFO: stanza-create command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --log-level-console=info check" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ " successfully archived to " ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: check command begin 2.37: --exec-id=977-a8bdb38d --log-level-console=info --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/pgsql/10/data --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --stanza=demo", "P00 INFO: check repo1 configuration (primary)", "P00 INFO: check repo1 archive for WAL (primary)", "P00 INFO: WAL segment 000000010000000000000001 successfully archived to '/var/lib/pgbackrest/archive/demo/10-1/0000000100000000/000000010000000000000001-84287fe4b534f0cb6a7a693e206d5fddde3af7cb.gz' on repo1", "P00 INFO: check command end: completed successfully" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "start-fast" : { "value" : "y" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/10/data", "", "[global]", "repo1-cipher-pass=zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO", "repo1-cipher-type=aes-256-cbc", "repo1-path=/var/lib/pgbackrest", "repo1-retention-full=2", "start-fast=y", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo \\", " --log-level-console=info backup" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "no prior backup exists|full backup size" 
] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: backup command begin 2.37: --exec-id=1041-336b8131 --log-level-console=info --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/pgsql/10/data --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo1-retention-full=2 --stanza=demo --start-fast", "P00 WARN: no prior backup exists, incr backup has been changed to full", "P00 INFO: execute non-exclusive pg_start_backup(): backup begins after the requested immediate checkpoint completes", "P00 INFO: backup start archive = 000000010000000000000002, lsn = 0/2000028", " [filtered 3 lines of output]", "P00 INFO: check archive for segment(s) 000000010000000000000002:000000010000000000000003", "P00 INFO: new backup label = 20211231-194304F", "P00 INFO: full backup size = 22.5MB, file total = 949", "P00 INFO: backup command end: completed successfully", "P00 INFO: expire command begin 2.37: --exec-id=1041-336b8131 --log-level-console=info --log-level-stderr=off --no-log-timestamp --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo1-retention-full=2 --stanza=demo" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest repo-ls backup/demo --filter=\"(F|D|I)$\" --sort=desc | head -1" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "20211231-194304F" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=diff \\", " --log-level-console=info backup" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "diff backup size" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 7 lines of output]", "P00 INFO: check archive for segment(s) 
000000010000000000000004:000000010000000000000005", "P00 INFO: new backup label = 20211231-194304F_20211231-194310D", "P00 INFO: diff backup size = 8.8KB, file total = 949", "P00 INFO: backup command end: completed successfully", "P00 INFO: expire command begin 2.37: --exec-id=1096-e5efad67 --log-level-console=info --log-level-stderr=off --no-log-timestamp --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo1-retention-full=2 --stanza=demo" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest info" ], "highlight" : { "filter" : false, "filter-context" : 2, "list" : [ "(full|incr|diff) backup" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "stanza: demo", " status: ok", " cipher: aes-256-cbc", "", " db (current)", " wal archive min/max (10): 000000010000000000000001/000000010000000000000005", "", " full backup: 20211231-194304F", " timestamp start/stop: 2021-12-31 19:43:04 / 2021-12-31 19:43:08", " wal start/stop: 000000010000000000000002 / 000000010000000000000003", " database size: 22.5MB, database backup size: 22.5MB", " repo1: backup set size: 2.7MB, backup size: 2.7MB", "", " diff backup: 20211231-194304F_20211231-194310D", " timestamp start/stop: 2021-12-31 19:43:10 / 2021-12-31 19:43:12", " wal start/stop: 000000010000000000000004 / 000000010000000000000005", " database size: 22.5MB, database backup size: 8.8KB", " repo1: backup set size: 2.7MB, backup size: 752B", " backup reference list: 20211231-194304F" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl stop postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres rm /var/lib/pgsql/10/data/global/pg_control" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" 
: { "bash-wrap" : true, "cmd" : [ "sudo systemctl start postgresql-10.service" ], "err-expect" : "1", "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl status postgresql-10.service" ], "err-expect" : "3", "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "Failed to start PostgreSQL" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 12 lines of output]", "Dec 31 19:43:15 pg-primary systemd[1]: postgresql-10.service: Main process exited, code=exited, status=2/INVALIDARGUMENT", "Dec 31 19:43:15 pg-primary systemd[1]: postgresql-10.service: Failed with result 'exit-code'.", "Dec 31 19:43:15 pg-primary systemd[1]: Failed to start PostgreSQL 10 database server." ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres find /var/lib/pgsql/10/data -mindepth 1 -delete" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo restore" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl start postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres mkdir -p /var/lib/pgsql/pgbackrest/doc/example" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cp -r /pgbackrest/doc/example/* \\", " 
/var/lib/pgsql/pgbackrest/doc/example" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat \\", " /var/lib/pgsql/pgbackrest/doc/example/pgsql-pgbackrest-info.sql" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "-- An example of monitoring pgBackRest from within PostgreSQL", "--", "-- Use copy to export data from the pgBackRest info command into the jsonb", "-- type so it can be queried directly by PostgreSQL.", "", "-- Create monitor schema", "create schema monitor;", "", "-- Get pgBackRest info in JSON format", "create function monitor.pgbackrest_info()", " returns jsonb AS $$", "declare", " data jsonb;", "begin", " -- Create a temp table to hold the JSON data", " create temp table temp_pgbackrest_data (data jsonb);", "", " -- Copy data into the table directly from the pgBackRest info command", " copy temp_pgbackrest_data (data)", " from program", " 'pgbackrest --output=json info' (format text);", "", " select temp_pgbackrest_data.data", " into data", " from temp_pgbackrest_data;", "", " drop table temp_pgbackrest_data;", "", " return data;", "end $$ language plpgsql;" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -f \\", " /var/lib/pgsql/pgbackrest/doc/example/pgsql-pgbackrest-info.sql" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat \\", " /var/lib/pgsql/pgbackrest/doc/example/pgsql-pgbackrest-query.sql" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "-- Get last successful backup for each stanza", "--", "-- Requires the monitor.pgbackrest_info function.", "with stanza as", "(", " select data->'name' as name,", " data->'backup'->(", " 
jsonb_array_length(data->'backup') - 1) as last_backup,", " data->'archive'->(", " jsonb_array_length(data->'archive') - 1) as current_archive", " from jsonb_array_elements(monitor.pgbackrest_info()) as data", ")", "select name,", " to_timestamp(", " (last_backup->'timestamp'->>'stop')::numeric) as last_successful_backup,", " current_archive->>'max' as last_archived_wal", " from stanza;" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -f \\", " /var/lib/pgsql/pgbackrest/doc/example/pgsql-pgbackrest-query.sql" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " name | last_successful_backup | last_archived_wal ", "--------+------------------------+--------------------------", " \"demo\" | 2021-12-31 19:43:12+00 | 000000010000000000000005", "(1 row)" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "repo1-retention-full" : { "value" : "2" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/10/data", "", "[global]", "repo1-cipher-pass=zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO", "repo1-cipher-type=aes-256-cbc", "repo1-path=/var/lib/pgbackrest", "repo1-retention-full=2", "start-fast=y", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=full \\", " --log-level-console=detail backup" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "archive retention on backup 20211231-194304F|remove archive" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 958 lines of output]", "P00 INFO: backup command end: completed successfully", "P00 INFO: expire command begin 2.37: --exec-id=1555-052ad819 --log-level-console=detail --log-level-stderr=off 
--no-log-timestamp --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo1-retention-full=2 --stanza=demo", "P00 DETAIL: repo1: 10-1 archive retention on backup 20211231-194304F, start = 000000010000000000000002", "P00 INFO: repo1: 10-1 remove archive, start = 000000010000000000000001, stop = 000000010000000000000001", "P00 INFO: expire command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest repo-ls backup/demo --filter=\"(F|D|I)$\" --sort=desc | head -1" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "20211231-194327F" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=full \\", " --log-level-console=info backup" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "expire full backup set 20211231-194304F|archive retention on backup 20211231-194327F|remove archive" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 9 lines of output]", "P00 INFO: backup command end: completed successfully", "P00 INFO: expire command begin 2.37: --exec-id=1610-9b83d72f --log-level-console=info --log-level-stderr=off --no-log-timestamp --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo1-retention-full=2 --stanza=demo", "P00 INFO: repo1: expire full backup set 20211231-194304F, 20211231-194304F_20211231-194310D", "P00 INFO: repo1: remove expired backup 20211231-194304F_20211231-194310D", "P00 INFO: repo1: remove expired backup 20211231-194304F", "P00 INFO: repo1: 10-1 remove archive, start = 0000000100000000, stop = 000000020000000000000006", "P00 INFO: expire command end: completed successfully" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { 
"repo1-retention-diff" : { "value" : "1" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/10/data", "", "[global]", "repo1-cipher-pass=zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO", "repo1-cipher-type=aes-256-cbc", "repo1-path=/var/lib/pgbackrest", "repo1-retention-diff=1", "repo1-retention-full=2", "start-fast=y", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=diff backup" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest repo-ls backup/demo --filter=\"(F|D|I)$\" --sort=desc | head -1" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "20211231-194333F_20211231-194340D" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=incr backup" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=diff \\", " --log-level-console=info backup" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "expire diff backup set 20211231-194333F_20211231-194340D" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 10 lines of output]", "P00 INFO: backup command end: completed successfully", "P00 INFO: expire command begin 2.37: --exec-id=1761-d5f40afa --log-level-console=info --log-level-stderr=off --no-log-timestamp --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo1-retention-diff=1 --repo1-retention-full=2 --stanza=demo", "P00 INFO: repo1: expire diff backup set 
20211231-194333F_20211231-194340D, 20211231-194333F_20211231-194343I", "P00 INFO: repo1: remove expired backup 20211231-194333F_20211231-194343I", "P00 INFO: repo1: remove expired backup 20211231-194333F_20211231-194340D", "P00 INFO: expire command end: completed successfully" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "repo1-retention-diff" : { "value" : "2" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/10/data", "", "[global]", "repo1-cipher-pass=zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO", "repo1-cipher-type=aes-256-cbc", "repo1-path=/var/lib/pgbackrest", "repo1-retention-diff=2", "repo1-retention-full=2", "start-fast=y", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest repo-ls backup/demo --filter=\"(F|D|I)$\" --sort=desc | head -1" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "20211231-194333F_20211231-194345D" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \" \\", " select pg_create_restore_point('generate WAL'); select pg_switch_wal(); \\", " select pg_create_restore_point('generate WAL'); select pg_switch_wal();\"" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=diff \\", " --log-level-console=info backup" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "new backup label" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 6 lines of output]", "P00 INFO: backup stop archive = 000000020000000000000013, lsn = 0/13000050", "P00 INFO: check archive for segment(s) 
000000020000000000000012:000000020000000000000013", "P00 INFO: new backup label = 20211231-194333F_20211231-194351D", "P00 INFO: diff backup size = 10.6KB, file total = 949", "P00 INFO: backup command end: completed successfully", " [filtered 2 lines of output]" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest repo-ls backup/demo --filter=\"(F|D|I)$\" --sort=desc | head -1" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "20211231-194333F_20211231-194351D" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --log-level-console=detail \\", " --repo1-retention-archive-type=diff --repo1-retention-archive=1 expire" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "archive retention on backup 20211231-194333F_20211231-194345D|remove archive" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: expire command begin 2.37: --exec-id=1936-5402b1a3 --log-level-console=detail --log-level-stderr=off --no-log-timestamp --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo1-retention-archive=1 --repo1-retention-archive-type=diff --repo1-retention-diff=2 --repo1-retention-full=2 --stanza=demo", "P00 DETAIL: repo1: 10-1 archive retention on backup 20211231-194327F, start = 000000020000000000000007, stop = 000000020000000000000007", "P00 DETAIL: repo1: 10-1 archive retention on backup 20211231-194333F, start = 000000020000000000000008, stop = 000000020000000000000009", "P00 DETAIL: repo1: 10-1 archive retention on backup 20211231-194333F_20211231-194345D, start = 00000002000000000000000E, stop = 00000002000000000000000F", "P00 DETAIL: repo1: 10-1 archive retention on backup 20211231-194333F_20211231-194351D, start = 000000020000000000000012", "P00 INFO: repo1: 10-1 remove archive, start = 
00000002000000000000000A, stop = 00000002000000000000000D", "P00 INFO: repo1: 10-1 remove archive, start = 000000020000000000000010, stop = 000000020000000000000011", "P00 INFO: expire command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl stop postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --delta \\", " --log-level-console=detail restore" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "demo\\/PG_VERSION - exists and matches backup|remove invalid files|rename global\\/pg_control" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 2 lines of output]", "P00 DETAIL: check '/var/lib/pgsql/10/data' exists", "P00 DETAIL: remove 'global/pg_control' so cluster will not start if restore does not complete", "P00 INFO: remove invalid files/links/paths from '/var/lib/pgsql/10/data'", "P00 DETAIL: remove invalid file '/var/lib/pgsql/10/data/backup_label.old'", "P00 DETAIL: remove invalid file '/var/lib/pgsql/10/data/base/13017/pg_internal.init'", " [filtered 1000 lines of output]" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl start postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"create database test1;\"" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "CREATE DATABASE" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c 
\"create database test2;\"" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "CREATE DATABASE" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"create table test1_table (id int); \\", " insert into test1_table (id) values (1);\" test1" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "INSERT 0 1" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"create table test2_table (id int); \\", " insert into test2_table (id) values (2);\" test2" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "INSERT 0 1" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=incr backup" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -Atc \"select oid from pg_database where datname = 'test1'\"" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "24576" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres du -sh /var/lib/pgsql/10/data/base/24576" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "7.5M\t/var/lib/pgsql/10/data/base/24576" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest repo-ls backup/demo --filter=\"(F|D|I)$\" --sort=desc | head -1" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "20211231-194333F_20211231-194405I" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo \\", " --set=20211231-194333F_20211231-194405I info" 
], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "database list" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 11 lines of output]", " repo1: backup set size: 4.5MB, backup size: 1.8MB", " backup reference list: 20211231-194333F, 20211231-194333F_20211231-194351D", " database list: postgres (13017), test1 (24576), test2 (24577)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl stop postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --delta \\", " --db-include=test2 --type=immediate --target-action=promote restore" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl start postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"select * from test2_table;\" test2" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " id ", "----", " 2", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"select * from test1_table;\" test1" ], "err-expect" : "2", "highlight" : { "filter" : false, "filter-context" : 2, "list" : [ "relation mapping file.*contains invalid data" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "psql: FATAL: relation mapping file \"base/24576/pg_filenode.map\" 
contains invalid data" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres du -sh /var/lib/pgsql/10/data/base/24576" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "16K\t/var/lib/pgsql/10/data/base/24576" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"drop database test1;\"" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "DROP DATABASE" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"select oid, datname from pg_database order by oid;\"" ], "highlight" : { "filter" : false, "filter-context" : 2, "list" : [ "test2" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " oid | datname ", "-------+-----------", " 1 | template1", " 13016 | template0", " 13017 | postgres", " 24577 | test2", "(4 rows)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=diff backup" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"begin; \\", " create table important_table (message text); \\", " insert into important_table values ('Important Data'); \\", " commit; \\", " select * from important_table;\"" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "Important Data" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " message ", "----------------", " Important Data", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 1" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres 
psql -Atc \"select current_timestamp\"" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "2021-12-31 19:44:27.079264+00" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 1" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"begin; \\", " drop table important_table; \\", " commit; \\", " select * from important_table;\"" ], "err-expect" : "1", "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "does not exist" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "ERROR: relation \"important_table\" does not exist", "LINE 1: ...le important_table; commit; select * from important_...", " ^" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl stop postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --delta \\", " --type=time \"--target=2021-12-31 19:44:27.079264+00\" \\", " --target-action=promote restore" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo rm /var/lib/pgsql/10/data/log/postgresql.log" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/lib/pgsql/10/data/recovery.conf" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "recovery_target_time" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "# Recovery settings generated by pgBackRest restore on 2021-12-31 
19:44:31", "restore_command = 'pgbackrest --stanza=demo archive-get %f \"%p\"'", "recovery_target_time = '2021-12-31 19:44:27.079264+00'", "recovery_target_action = 'promote'" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl start postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"select * from important_table\"" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "Important Data" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " message ", "----------------", " Important Data", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/lib/pgsql/10/data/log/postgresql.log" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "recovery stopping before|last completed transaction|starting point-in-time recovery" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "LOG: database system was interrupted; last known up at 2021-12-31 19:44:21 UTC", "LOG: starting point-in-time recovery to 2021-12-31 19:44:27.079264+00", "LOG: restored log file \"00000004.history\" from archive", "LOG: restored log file \"000000040000000000000016\" from archive", " [filtered 2 lines of output]", "LOG: database system is ready to accept read only connections", "LOG: restored log file \"000000040000000000000017\" from archive", "LOG: recovery stopping before commit of transaction 564, time 2021-12-31 19:44:29.264483+00", "LOG: redo done at 0/17021810", "LOG: last completed transaction was at log time 2021-12-31 19:44:24.848796+00", "LOG: selected 
new timeline ID: 5", "LOG: archive recovery complete", "LOG: database system is ready to accept connections" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"begin; \\", " drop table important_table; \\", " commit; \\", " select * from important_table;\"" ], "err-expect" : "1", "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "does not exist" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "ERROR: relation \"important_table\" does not exist", "LINE 1: ...le important_table; commit; select * from important_...", " ^" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=incr backup" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest repo-ls backup/demo --filter=\"(F|D|I)$\" --sort=desc | head -1" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "20211231-194333F_20211231-194438I" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest info" ], "highlight" : { "filter" : false, "filter-context" : 2, "list" : [ "20211231-194333F_20211231-194438I" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "stanza: demo", " status: ok", " cipher: aes-256-cbc", "", " db (current)", " wal archive min/max (10): 000000020000000000000007/000000050000000000000018", "", " full backup: 20211231-194327F", " timestamp start/stop: 2021-12-31 19:43:27 / 2021-12-31 19:43:31", " wal start/stop: 000000020000000000000007 / 000000020000000000000007", " database size: 22.5MB, database backup size: 22.5MB", " repo1: backup set size: 2.7MB, backup size: 2.7MB", "", " full backup: 20211231-194333F", " timestamp start/stop: 2021-12-31 
19:43:33 / 2021-12-31 19:43:37", " wal start/stop: 000000020000000000000008 / 000000020000000000000009", " database size: 22.5MB, database backup size: 22.5MB", " repo1: backup set size: 2.7MB, backup size: 2.7MB", "", " diff backup: 20211231-194333F_20211231-194351D", " timestamp start/stop: 2021-12-31 19:43:51 / 2021-12-31 19:43:53", " wal start/stop: 000000020000000000000012 / 000000020000000000000013", " database size: 22.5MB, database backup size: 10.6KB", " repo1: backup set size: 2.7MB, backup size: 992B", " backup reference list: 20211231-194333F", "", " incr backup: 20211231-194333F_20211231-194405I", " timestamp start/stop: 2021-12-31 19:44:05 / 2021-12-31 19:44:08", " wal start/stop: 000000030000000000000015 / 000000030000000000000015", " database size: 37.2MB, database backup size: 15.2MB", " repo1: backup set size: 4.5MB, backup size: 1.8MB", " backup reference list: 20211231-194333F, 20211231-194333F_20211231-194351D", "", " diff backup: 20211231-194333F_20211231-194421D", " timestamp start/stop: 2021-12-31 19:44:21 / 2021-12-31 19:44:24", " wal start/stop: 000000040000000000000016 / 000000040000000000000016", " database size: 29.9MB, database backup size: 7.8MB", " repo1: backup set size: 3.6MB, backup size: 952.3KB", " backup reference list: 20211231-194333F", "", " incr backup: 20211231-194333F_20211231-194438I", " timestamp start/stop: 2021-12-31 19:44:38 / 2021-12-31 19:44:40", " wal start/stop: 000000050000000000000018 / 000000050000000000000018", " database size: 29.9MB, database backup size: 2.1MB", " repo1: backup set size: 3.6MB, backup size: 218.7KB", " backup reference list: 20211231-194333F, 20211231-194333F_20211231-194421D" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl stop postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --delta \\", " 
--set=20211231-194333F_20211231-194438I \\", " --type=time \"--target=2021-12-31 19:44:27.079264+00\" --target-action=promote restore" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo rm /var/lib/pgsql/10/data/log/postgresql.log" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl start postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"select * from important_table\"" ], "err-expect" : "1", "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "does not exist" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "ERROR: relation \"important_table\" does not exist", "LINE 1: select * from important_table", " ^" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/lib/pgsql/10/data/log/postgresql.log" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "starting point-in-time recovery|consistent recovery state reached" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "LOG: database system was interrupted; last known up at 2021-12-31 19:44:38 UTC", "LOG: starting point-in-time recovery to 2021-12-31 19:44:27.079264+00", "LOG: restored log file \"00000005.history\" from archive", "LOG: restored log file \"000000050000000000000018\" from archive", "LOG: redo starts at 0/18000028", "LOG: consistent recovery state reached at 0/180000F8", 
"LOG: database system is ready to accept read only connections", "LOG: redo done at 0/180000F8", " [filtered 6 lines of output]" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl stop postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --delta \\", " --type=time \"--target=2021-12-31 19:44:27.079264+00\" \\", " --target-action=promote restore" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo rm /var/lib/pgsql/10/data/log/postgresql.log" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl start postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"select * from important_table\"" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "Important Data" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " message ", "----------------", " Important Data", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/lib/pgsql/10/data/log/postgresql.log" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "recovery stopping before|last completed transaction|starting point-in-time recovery" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "LOG: database system was 
interrupted; last known up at 2021-12-31 19:44:21 UTC", "LOG: starting point-in-time recovery to 2021-12-31 19:44:27.079264+00", "LOG: restored log file \"00000004.history\" from archive", "LOG: restored log file \"000000040000000000000016\" from archive", " [filtered 2 lines of output]", "LOG: database system is ready to accept read only connections", "LOG: restored log file \"000000040000000000000017\" from archive", "LOG: recovery stopping before commit of transaction 564, time 2021-12-31 19:44:29.264483+00", "LOG: redo done at 0/17021810", "LOG: last completed transaction was at log time 2021-12-31 19:44:24.848796+00", "LOG: restored log file \"00000005.history\" from archive", "LOG: restored log file \"00000006.history\" from archive", " [filtered 3 lines of output]" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "process-max" : { "value" : "4" }, "repo2-azure-account" : { "value" : "pgbackrest" }, "repo2-azure-container" : { "value" : "demo-container" }, "repo2-azure-key" : { "value" : "YXpLZXk=" }, "repo2-path" : { "value" : "/demo-repo" }, "repo2-retention-full" : { "value" : "4" }, "repo2-type" : { "value" : "azure" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/10/data", "", "[global]", "process-max=4", "repo1-cipher-pass=zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO", "repo1-cipher-type=aes-256-cbc", "repo1-path=/var/lib/pgbackrest", "repo1-retention-diff=2", "repo1-retention-full=2", "repo2-azure-account=pgbackrest", "repo2-azure-container=demo-container", "repo2-azure-key=YXpLZXk=", "repo2-path=/demo-repo", "repo2-retention-full=4", "repo2-type=azure", "start-fast=y", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "echo \"172.17.0.2 pgbackrest.blob.core.windows.net\" | tee -a /etc/hosts" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : "root" 
}, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --repo=2 repo-create" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --log-level-console=info stanza-create" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "completed successfully" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 2 lines of output]", "P00 INFO: stanza 'demo' already exists on repo1 and is valid", "P00 INFO: stanza-create for stanza 'demo' on repo2", "P00 INFO: stanza-create command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --repo=2 \\", " --log-level-console=info backup" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "no prior backup exists|full backup size" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: backup command begin 2.37: --exec-id=3515-0d0165d7 --log-level-console=info --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/pgsql/10/data --process-max=4 --repo=2 --repo2-azure-account= --repo2-azure-container=demo-container --repo2-azure-key= --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo2-path=/demo-repo --repo1-retention-diff=2 --repo1-retention-full=2 --repo2-retention-full=4 --repo2-type=azure --stanza=demo --start-fast", "P00 WARN: no prior backup exists, incr backup has been changed to full", "P00 INFO: execute non-exclusive pg_start_backup(): backup begins after the requested immediate checkpoint completes", "P00 INFO: backup start archive = 000000070000000000000018, lsn = 0/18000028", " [filtered 3 lines of output]", "P00 INFO: check archive for 
segment(s) 000000070000000000000018:000000070000000000000018", "P00 INFO: new backup label = 20211231-194501F", "P00 INFO: full backup size = 29.9MB, file total = 1246", "P00 INFO: backup command end: completed successfully", "P00 INFO: expire command begin 2.37: --exec-id=3515-0d0165d7 --log-level-console=info --log-level-stderr=off --no-log-timestamp --repo=2 --repo2-azure-account= --repo2-azure-container=demo-container --repo2-azure-key= --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo2-path=/demo-repo --repo1-retention-diff=2 --repo1-retention-full=2 --repo2-retention-full=4 --repo2-type=azure --stanza=demo" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "process-max" : { "value" : "4" }, "repo3-path" : { "value" : "/demo-repo" }, "repo3-retention-full" : { "value" : "4" }, "repo3-s3-bucket" : { "value" : "demo-bucket" }, "repo3-s3-endpoint" : { "value" : "s3.us-east-1.amazonaws.com" }, "repo3-s3-key" : { "value" : "accessKey1" }, "repo3-s3-key-secret" : { "value" : "verySecretKey1" }, "repo3-s3-region" : { "value" : "us-east-1" }, "repo3-type" : { "value" : "s3" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/10/data", "", "[global]", "process-max=4", "repo1-cipher-pass=zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO", "repo1-cipher-type=aes-256-cbc", "repo1-path=/var/lib/pgbackrest", "repo1-retention-diff=2", "repo1-retention-full=2", "repo2-azure-account=pgbackrest", "repo2-azure-container=demo-container", "repo2-azure-key=YXpLZXk=", "repo2-path=/demo-repo", "repo2-retention-full=4", "repo2-type=azure", "repo3-path=/demo-repo", "repo3-retention-full=4", "repo3-s3-bucket=demo-bucket", "repo3-s3-endpoint=s3.us-east-1.amazonaws.com", "repo3-s3-key=accessKey1", "repo3-s3-key-secret=verySecretKey1", "repo3-s3-region=us-east-1", "repo3-type=s3", "start-fast=y", "", "[global:archive-push]", 
"compress-level=3" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "echo \"172.17.0.3 demo-bucket.s3.us-east-1.amazonaws.com s3.us-east-1.amazonaws.com\" | tee -a /etc/hosts" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : "root" }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --repo=3 repo-create" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --log-level-console=info stanza-create" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "completed successfully" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 4 lines of output]", "P00 INFO: stanza 'demo' already exists on repo2 and is valid", "P00 INFO: stanza-create for stanza 'demo' on repo3", "P00 INFO: stanza-create command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --repo=3 \\", " --log-level-console=info backup" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "no prior backup exists|full backup size" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: backup command begin 2.37: --exec-id=3655-556e7ff2 --log-level-console=info --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/pgsql/10/data --process-max=4 --repo=3 --repo2-azure-account= --repo2-azure-container=demo-container --repo2-azure-key= --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo2-path=/demo-repo --repo3-path=/demo-repo --repo1-retention-diff=2 --repo1-retention-full=2 --repo2-retention-full=4 --repo3-retention-full=4 --repo3-s3-bucket=demo-bucket 
--repo3-s3-endpoint=s3.us-east-1.amazonaws.com --repo3-s3-key= --repo3-s3-key-secret= --repo3-s3-region=us-east-1 --repo2-type=azure --repo3-type=s3 --stanza=demo --start-fast", "P00 WARN: no prior backup exists, incr backup has been changed to full", "P00 INFO: execute non-exclusive pg_start_backup(): backup begins after the requested immediate checkpoint completes", "P00 INFO: backup start archive = 000000070000000000000019, lsn = 0/19000028", " [filtered 3 lines of output]", "P00 INFO: check archive for segment(s) 000000070000000000000019:00000007000000000000001A", "P00 INFO: new backup label = 20211231-194519F", "P00 INFO: full backup size = 29.9MB, file total = 1246", "P00 INFO: backup command end: completed successfully", "P00 INFO: expire command begin 2.37: --exec-id=3655-556e7ff2 --log-level-console=info --log-level-stderr=off --no-log-timestamp --repo=3 --repo2-azure-account= --repo2-azure-container=demo-container --repo2-azure-key= --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo1-path=/var/lib/pgbackrest --repo2-path=/demo-repo --repo3-path=/demo-repo --repo1-retention-diff=2 --repo1-retention-full=2 --repo2-retention-full=4 --repo3-retention-full=4 --repo3-s3-bucket=demo-bucket --repo3-s3-endpoint=s3.us-east-1.amazonaws.com --repo3-s3-key= --repo3-s3-key-secret= --repo3-s3-region=us-east-1 --repo2-type=azure --repo3-type=s3 --stanza=demo" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "process-max" : { "value" : "4" }, "repo4-gcs-bucket" : { "value" : "demo-bucket" }, "repo4-gcs-key" : { "value" : "/etc/pgbackrest/gcs-key.json" }, "repo4-path" : { "value" : "/demo-repo" }, "repo4-type" : { "value" : "gcs" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/10/data", "", "[global]", "process-max=4", "repo1-cipher-pass=zWaf6XtpjIVZC5444yXB+cgFDFl7MxGlgkZSaoPvTGirhPygu4jOKOXf9LO4vjfO", "repo1-cipher-type=aes-256-cbc", 
"repo1-path=/var/lib/pgbackrest", "repo1-retention-diff=2", "repo1-retention-full=2", "repo2-azure-account=pgbackrest", "repo2-azure-container=demo-container", "repo2-azure-key=YXpLZXk=", "repo2-path=/demo-repo", "repo2-retention-full=4", "repo2-type=azure", "repo3-path=/demo-repo", "repo3-retention-full=4", "repo3-s3-bucket=demo-bucket", "repo3-s3-endpoint=s3.us-east-1.amazonaws.com", "repo3-s3-key=accessKey1", "repo3-s3-key-secret=verySecretKey1", "repo3-s3-region=us-east-1", "repo3-type=s3", "repo4-gcs-bucket=demo-bucket", "repo4-gcs-key=/etc/pgbackrest/gcs-key.json", "repo4-path=/demo-repo", "repo4-type=gcs", "start-fast=y", "", "[global:archive-push]", "compress-level=3" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl stop postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --log-level-console=info stop" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "completed successfully" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: stop command begin 2.37: --exec-id=3749-8d7b18b8 --log-level-console=info --log-level-stderr=off --no-log-timestamp --stanza=demo", "P00 INFO: stop command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --repo=1 \\", " --log-level-console=info stanza-delete" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "completed successfully" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: stanza-delete command begin 2.37: --exec-id=3773-1ad02626 --log-level-console=info --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/pgsql/10/data --repo=1 --repo2-azure-account= 
--repo2-azure-container=demo-container --repo2-azure-key= --repo1-cipher-pass= --repo1-cipher-type=aes-256-cbc --repo4-gcs-bucket=demo-bucket --repo4-gcs-key= --repo1-path=/var/lib/pgbackrest --repo2-path=/demo-repo --repo3-path=/demo-repo --repo4-path=/demo-repo --repo3-s3-bucket=demo-bucket --repo3-s3-endpoint=s3.us-east-1.amazonaws.com --repo3-s3-key= --repo3-s3-key-secret= --repo3-s3-region=us-east-1 --repo2-type=azure --repo3-type=s3 --repo4-type=gcs --stanza=demo", "P00 INFO: stanza-delete command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl start postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "id" : "repo1", "image" : "pgbackrest/doc:rhel", "name" : "repository", "option" : "-m 512m -v /sys/fs/cgroup:/sys/fs/cgroup:rw -v /tmp/$(mktemp -d):/run", "os" : "rhel", "update-hosts" : true }, "type" : "host", "value" : { "ip" : "172.17.0.6" } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo groupadd pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo adduser -gpgbackrest -n pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo yum install postgresql-libs" ], "cmd-extra" : "-y 2>&1", "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo scp build:/build/pgbackrest-release-2.37/src/pgbackrest /usr/bin" ], "cmd-extra" : "2>&1", "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chmod 755 /usr/bin/pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { 
"key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p -m 770 /var/log/pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown pgbackrest:pgbackrest /var/log/pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p /etc/pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p /etc/pgbackrest/conf.d" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo touch /etc/pgbackrest/pgbackrest.conf" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chmod 640 /etc/pgbackrest/pgbackrest.conf" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown pgbackrest:pgbackrest /etc/pgbackrest/pgbackrest.conf" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p /var/lib/pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chmod 750 /var/lib/pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown pgbackrest:pgbackrest /var/lib/pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : 
"repository", "option" : { "global" : { "repo1-path" : { "value" : "/var/lib/pgbackrest" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[global]", "repo1-path=/var/lib/pgbackrest" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "repository", "option" : { "demo" : { "pg1-host" : { "value" : "pg-primary" }, "pg1-host-ca-file" : { "value" : "/etc/pgbackrest/cert/ca.crt" }, "pg1-host-cert-file" : { "value" : "/etc/pgbackrest/cert/client.crt" }, "pg1-host-key-file" : { "value" : "/etc/pgbackrest/cert/client.key" }, "pg1-host-type" : { "value" : "tls" }, "pg1-path" : { "value" : "/var/lib/pgsql/10/data" } }, "global" : { "log-level-stderr" : { "value" : "off" }, "log-timestamp" : { "value" : "n" }, "repo1-retention-full" : { "value" : "2" }, "start-fast" : { "value" : "y" }, "tls-server-address" : { "value" : "*" }, "tls-server-auth" : { "value" : "pgbackrest-client=demo" }, "tls-server-ca-file" : { "value" : "/etc/pgbackrest/cert/ca.crt" }, "tls-server-cert-file" : { "value" : "/etc/pgbackrest/cert/server.crt" }, "tls-server-key-file" : { "value" : "/etc/pgbackrest/cert/server.key" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-host=pg-primary", "pg1-host-ca-file=/etc/pgbackrest/cert/ca.crt", "pg1-host-cert-file=/etc/pgbackrest/cert/client.crt", "pg1-host-key-file=/etc/pgbackrest/cert/client.key", "pg1-host-type=tls", "pg1-path=/var/lib/pgsql/10/data", "", "[global]", "repo1-path=/var/lib/pgbackrest", "repo1-retention-full=2", "start-fast=y", "tls-server-address=*", "tls-server-auth=pgbackrest-client=demo", "tls-server-ca-file=/etc/pgbackrest/cert/ca.crt", "tls-server-cert-file=/etc/pgbackrest/cert/server.crt", "tls-server-key-file=/etc/pgbackrest/cert/server.key" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "demo" : { "pg1-path" : { "value" : "/var/lib/pgsql/10/data" }, "repo1-host-ca-file" : { "value" : "/etc/pgbackrest/cert/ca.crt" }, 
"repo1-host-cert-file" : { "value" : "/etc/pgbackrest/cert/client.crt" }, "repo1-host-key-file" : { "value" : "/etc/pgbackrest/cert/client.key" }, "repo1-host-type" : { "value" : "tls" } }, "global" : { "log-level-file" : { "value" : "detail" }, "log-level-stderr" : { "value" : "off" }, "log-timestamp" : { "value" : "n" }, "repo1-host" : { "value" : "repository" }, "tls-server-address" : { "value" : "*" }, "tls-server-auth" : { "value" : "pgbackrest-client=demo" }, "tls-server-ca-file" : { "value" : "/etc/pgbackrest/cert/ca.crt" }, "tls-server-cert-file" : { "value" : "/etc/pgbackrest/cert/server.crt" }, "tls-server-key-file" : { "value" : "/etc/pgbackrest/cert/server.key" } } }, "reset" : true }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/10/data", "repo1-host-ca-file=/etc/pgbackrest/cert/ca.crt", "repo1-host-cert-file=/etc/pgbackrest/cert/client.crt", "repo1-host-key-file=/etc/pgbackrest/cert/client.key", "repo1-host-type=tls", "", "[global]", "log-level-file=detail", "repo1-host=repository", "tls-server-address=*", "tls-server-auth=pgbackrest-client=demo", "tls-server-ca-file=/etc/pgbackrest/cert/ca.crt", "tls-server-cert-file=/etc/pgbackrest/cert/server.crt", "tls-server-key-file=/etc/pgbackrest/cert/server.key" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "mkdir -p -m 770 /etc/pgbackrest/cert && \\", " cp /pgbackrest/doc/resource/fake-cert/ca.crt \\", " /etc/pgbackrest/cert/ca.crt && \\", " \\", " openssl genrsa -out /etc/pgbackrest/cert/server.key 2048 2>&1 && \\", " chmod 600 /etc/pgbackrest/cert/server.key && \\", " openssl req -new -sha256 -nodes -out /etc/pgbackrest/cert/server.csr \\", " -key /etc/pgbackrest/cert/server.key -subj \"/CN=repository\" 2>&1 && \\", " openssl x509 -req -in /etc/pgbackrest/cert/server.csr \\", " -CA /etc/pgbackrest/cert/ca.crt \\", " -CAkey /pgbackrest/doc/resource/fake-cert/ca.key -CAcreateserial \\", " -out /etc/pgbackrest/cert/server.crt -days 9 2>&1 && \\", " \\", " openssl 
genrsa -out /etc/pgbackrest/cert/client.key 2048 2>&1 && \\", " chmod 600 /etc/pgbackrest/cert/client.key && \\", " openssl req -new -sha256 -nodes -out /etc/pgbackrest/cert/client.csr \\", " -key /etc/pgbackrest/cert/client.key -subj \"/CN=pgbackrest-client\" 2>&1 && \\", " openssl x509 -req -in /etc/pgbackrest/cert/client.csr \\", " -CA /etc/pgbackrest/cert/ca.crt \\", " -CAkey /pgbackrest/doc/resource/fake-cert/ca.key -CAcreateserial \\", " -out /etc/pgbackrest/cert/client.crt -days 9 2>&1 && \\", " \\", " chown -R pgbackrest /etc/pgbackrest/cert" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : "root" }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "echo '[Unit]' | tee /etc/systemd/system/pgbackrest.service && \\", " echo 'Description=pgBackRest Server' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'After=network.target' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'StartLimitIntervalSec=0' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo '' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo '[Service]' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'Type=simple' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'Restart=always' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'RestartSec=1' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'User=pgbackrest' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'ExecStart=/usr/bin/pgbackrest server' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'ExecReload=kill -HUP $MAINPID' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo '' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo '[Install]' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'WantedBy=multi-user.target' | tee -a /etc/systemd/system/pgbackrest.service" ], "host" : "repository", "load-env" : true, "output" : false, 
"run-as-user" : "root" }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo cat /etc/systemd/system/pgbackrest.service" ], "host" : "repository", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "[Unit]", "Description=pgBackRest Server", "After=network.target", "StartLimitIntervalSec=0", "", "[Service]", "Type=simple", "Restart=always", "RestartSec=1", "User=pgbackrest", "ExecStart=/usr/bin/pgbackrest server", "ExecReload=kill -HUP $MAINPID", "", "[Install]", "WantedBy=multi-user.target" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl enable pgbackrest" ], "cmd-extra" : "2>&1", "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl start pgbackrest" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "mkdir -p -m 770 /etc/pgbackrest/cert && \\", " cp /pgbackrest/doc/resource/fake-cert/ca.crt \\", " /etc/pgbackrest/cert/ca.crt && \\", " \\", " openssl genrsa -out /etc/pgbackrest/cert/server.key 2048 2>&1 && \\", " chmod 600 /etc/pgbackrest/cert/server.key && \\", " openssl req -new -sha256 -nodes -out /etc/pgbackrest/cert/server.csr \\", " -key /etc/pgbackrest/cert/server.key -subj \"/CN=pg-primary\" 2>&1 && \\", " openssl x509 -req -in /etc/pgbackrest/cert/server.csr \\", " -CA /etc/pgbackrest/cert/ca.crt \\", " -CAkey /pgbackrest/doc/resource/fake-cert/ca.key -CAcreateserial \\", " -out /etc/pgbackrest/cert/server.crt -days 9 2>&1 && \\", " \\", " openssl genrsa -out /etc/pgbackrest/cert/client.key 2048 2>&1 && \\", " chmod 600 /etc/pgbackrest/cert/client.key && \\", " openssl req -new -sha256 -nodes -out /etc/pgbackrest/cert/client.csr \\", " -key /etc/pgbackrest/cert/client.key -subj \"/CN=pgbackrest-client\" 2>&1 && \\", " openssl x509 -req -in /etc/pgbackrest/cert/client.csr \\", 
" -CA /etc/pgbackrest/cert/ca.crt \\", " -CAkey /pgbackrest/doc/resource/fake-cert/ca.key -CAcreateserial \\", " -out /etc/pgbackrest/cert/client.crt -days 9 2>&1 && \\", " \\", " chown -R postgres /etc/pgbackrest/cert" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : "root" }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "echo '[Unit]' | tee /etc/systemd/system/pgbackrest.service && \\", " echo 'Description=pgBackRest Server' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'After=network.target' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'StartLimitIntervalSec=0' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo '' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo '[Service]' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'Type=simple' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'Restart=always' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'RestartSec=1' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'User=postgres' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'ExecStart=/usr/bin/pgbackrest server' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'ExecReload=kill -HUP $MAINPID' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo '' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo '[Install]' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'WantedBy=multi-user.target' | tee -a /etc/systemd/system/pgbackrest.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : "root" }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo cat /etc/systemd/system/pgbackrest.service" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "[Unit]", "Description=pgBackRest Server", "After=network.target", 
"StartLimitIntervalSec=0", "", "[Service]", "Type=simple", "Restart=always", "RestartSec=1", "User=postgres", "ExecStart=/usr/bin/pgbackrest server", "ExecReload=kill -HUP $MAINPID", "", "[Install]", "WantedBy=multi-user.target" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl enable pgbackrest" ], "cmd-extra" : "2>&1", "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl start pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo stanza-create" ], "host" : "repository", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo check" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo check" ], "host" : "repository", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo backup" ], "host" : "repository", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 WARN: no prior backup exists, incr backup has been changed to full" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl stop postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --delta restore" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl start 
postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo --type=full backup" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "repository", "option" : { "global" : { "process-max" : { "value" : "3" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-host=pg-primary", "pg1-host-ca-file=/etc/pgbackrest/cert/ca.crt", "pg1-host-cert-file=/etc/pgbackrest/cert/client.crt", "pg1-host-key-file=/etc/pgbackrest/cert/client.key", "pg1-host-type=tls", "pg1-path=/var/lib/pgsql/10/data", "", "[global]", "process-max=3", "repo1-path=/var/lib/pgbackrest", "repo1-retention-full=2", "start-fast=y", "tls-server-address=*", "tls-server-auth=pgbackrest-client=demo", "tls-server-ca-file=/etc/pgbackrest/cert/ca.crt", "tls-server-cert-file=/etc/pgbackrest/cert/server.crt", "tls-server-key-file=/etc/pgbackrest/cert/server.key" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo --type=full backup" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest info" ], "highlight" : { "filter" : false, "filter-context" : 2, "list" : [ "timestamp start/stop" ] }, "host" : "repository", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "stanza: demo", " status: ok", " cipher: none", "", " db (current)", " wal archive min/max (10): 000000080000000000000021/000000080000000000000023", "", " full backup: 20211231-194712F", " 
timestamp start/stop: 2021-12-31 19:47:12 / 2021-12-31 19:47:17", " wal start/stop: 000000080000000000000021 / 000000080000000000000021", " database size: 29.9MB, database backup size: 29.9MB", " repo1: backup set size: 3.5MB, backup size: 3.5MB", "", " full backup: 20211231-194720F", " timestamp start/stop: 2021-12-31 19:47:20 / 2021-12-31 19:47:25", " wal start/stop: 000000080000000000000022 / 000000080000000000000023", " database size: 29.9MB, database backup size: 29.9MB", " repo1: backup set size: 3.5MB, backup size: 3.5MB" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest stop" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo backup" ], "err-expect" : "56", "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "\\: stop file exists for all stanzas" ] }, "host" : "repository", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 WARN: unable to check pg-1: [StopError] raised from remote-0 tls protocol on 'pg-primary': stop file exists for all stanzas", "P00 ERROR: [056]: unable to find primary cluster - cannot proceed" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest stop" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 WARN: stop file already exists for all stanzas" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest start" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo stop" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest 
--stanza=demo backup" ], "err-expect" : "56", "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "\\: stop file exists for stanza demo" ] }, "host" : "repository", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 WARN: unable to check pg-1: [StopError] raised from remote-0 tls protocol on 'pg-primary': stop file exists for stanza demo", "P00 ERROR: [056]: unable to find primary cluster - cannot proceed" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo start" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "id" : "pg2", "image" : "pgbackrest/doc:rhel", "name" : "pg-standby", "option" : "-m 512m -v /sys/fs/cgroup:/sys/fs/cgroup:rw -v /tmp/$(mktemp -d):/run", "os" : "rhel", "update-hosts" : true }, "type" : "host", "value" : { "ip" : "172.17.0.7" } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo yum install postgresql-libs" ], "cmd-extra" : "-y 2>&1", "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo scp build:/build/pgbackrest-release-2.37/src/pgbackrest /usr/bin" ], "cmd-extra" : "2>&1", "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chmod 755 /usr/bin/pgbackrest" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p -m 770 /var/log/pgbackrest" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown postgres:postgres /var/log/pgbackrest" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, 
"cmd" : [ "sudo mkdir -p /etc/pgbackrest" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p /etc/pgbackrest/conf.d" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo touch /etc/pgbackrest/pgbackrest.conf" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chmod 640 /etc/pgbackrest/pgbackrest.conf" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown postgres:postgres /etc/pgbackrest/pgbackrest.conf" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-standby", "option" : { "demo" : { "pg1-path" : { "value" : "/var/lib/pgsql/10/data" }, "repo1-host-ca-file" : { "value" : "/etc/pgbackrest/cert/ca.crt" }, "repo1-host-cert-file" : { "value" : "/etc/pgbackrest/cert/client.crt" }, "repo1-host-key-file" : { "value" : "/etc/pgbackrest/cert/client.key" }, "repo1-host-type" : { "value" : "tls" } }, "global" : { "log-level-file" : { "value" : "detail" }, "log-level-stderr" : { "value" : "off" }, "log-timestamp" : { "value" : "n" }, "repo1-host" : { "value" : "repository" }, "tls-server-address" : { "value" : "*" }, "tls-server-auth" : { "value" : "pgbackrest-client=demo" }, "tls-server-ca-file" : { "value" : "/etc/pgbackrest/cert/ca.crt" }, "tls-server-cert-file" : { "value" : "/etc/pgbackrest/cert/server.crt" }, "tls-server-key-file" : { "value" : "/etc/pgbackrest/cert/server.key" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/10/data", "repo1-host-ca-file=/etc/pgbackrest/cert/ca.crt", 
"repo1-host-cert-file=/etc/pgbackrest/cert/client.crt", "repo1-host-key-file=/etc/pgbackrest/cert/client.key", "repo1-host-type=tls", "", "[global]", "log-level-file=detail", "repo1-host=repository", "tls-server-address=*", "tls-server-auth=pgbackrest-client=demo", "tls-server-ca-file=/etc/pgbackrest/cert/ca.crt", "tls-server-cert-file=/etc/pgbackrest/cert/server.crt", "tls-server-key-file=/etc/pgbackrest/cert/server.key" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "mkdir -p -m 770 /etc/pgbackrest/cert && \\", " cp /pgbackrest/doc/resource/fake-cert/ca.crt \\", " /etc/pgbackrest/cert/ca.crt && \\", " \\", " openssl genrsa -out /etc/pgbackrest/cert/server.key 2048 2>&1 && \\", " chmod 600 /etc/pgbackrest/cert/server.key && \\", " openssl req -new -sha256 -nodes -out /etc/pgbackrest/cert/server.csr \\", " -key /etc/pgbackrest/cert/server.key -subj \"/CN=pg-standby\" 2>&1 && \\", " openssl x509 -req -in /etc/pgbackrest/cert/server.csr \\", " -CA /etc/pgbackrest/cert/ca.crt \\", " -CAkey /pgbackrest/doc/resource/fake-cert/ca.key -CAcreateserial \\", " -out /etc/pgbackrest/cert/server.crt -days 9 2>&1 && \\", " \\", " openssl genrsa -out /etc/pgbackrest/cert/client.key 2048 2>&1 && \\", " chmod 600 /etc/pgbackrest/cert/client.key && \\", " openssl req -new -sha256 -nodes -out /etc/pgbackrest/cert/client.csr \\", " -key /etc/pgbackrest/cert/client.key -subj \"/CN=pgbackrest-client\" 2>&1 && \\", " openssl x509 -req -in /etc/pgbackrest/cert/client.csr \\", " -CA /etc/pgbackrest/cert/ca.crt \\", " -CAkey /pgbackrest/doc/resource/fake-cert/ca.key -CAcreateserial \\", " -out /etc/pgbackrest/cert/client.crt -days 9 2>&1 && \\", " \\", " chown -R postgres /etc/pgbackrest/cert" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : "root" }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "echo '[Unit]' | tee /etc/systemd/system/pgbackrest.service && \\", " echo 'Description=pgBackRest Server' | tee -a 
/etc/systemd/system/pgbackrest.service && \\", " echo 'After=network.target' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'StartLimitIntervalSec=0' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo '' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo '[Service]' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'Type=simple' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'Restart=always' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'RestartSec=1' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'User=postgres' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'ExecStart=/usr/bin/pgbackrest server' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'ExecReload=kill -HUP $MAINPID' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo '' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo '[Install]' | tee -a /etc/systemd/system/pgbackrest.service && \\", " echo 'WantedBy=multi-user.target' | tee -a /etc/systemd/system/pgbackrest.service" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : "root" }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo cat /etc/systemd/system/pgbackrest.service" ], "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "[Unit]", "Description=pgBackRest Server", "After=network.target", "StartLimitIntervalSec=0", "", "[Service]", "Type=simple", "Restart=always", "RestartSec=1", "User=postgres", "ExecStart=/usr/bin/pgbackrest server", "ExecReload=kill -HUP $MAINPID", "", "[Install]", "WantedBy=multi-user.target" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl enable pgbackrest" ], "cmd-extra" : "2>&1", "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl start 
pgbackrest" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres mkdir -p -m 700 /var/lib/pgsql/10/data" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=standby restore" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/lib/pgsql/10/data/recovery.conf" ], "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "# Recovery settings generated by pgBackRest restore on 2021-12-31 19:47:54", "restore_command = 'pgbackrest --stanza=demo archive-get %f \"%p\"'", "standby_mode = 'on'" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "echo 'shared_buffers = 16MB' >> /var/lib/pgsql/10/data/postgresql.conf" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : "root" }, "type" : "exe" }, { "key" : { "file" : "/var/lib/pgsql/10/data/postgresql.conf", "host" : "pg-standby", "option" : { "archive_command" : { "value" : "'pgbackrest --stanza=demo archive-push %p'" }, "archive_mode" : { "value" : "on" }, "hot_standby" : { "value" : "on" }, "log_filename" : { "value" : "'postgresql.log'" }, "log_line_prefix" : { "value" : "''" }, "max_wal_senders" : { "value" : "3" }, "wal_level" : { "value" : "replica" } } }, "type" : "cfg-postgresql", "value" : { "config" : [ "archive_command = 'pgbackrest --stanza=demo archive-push %p'", "archive_mode = on", "hot_standby = on", "log_filename = 'postgresql.log'", "log_line_prefix = ''", "max_wal_senders = 3", "wal_level = replica" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo rm /var/lib/pgsql/10/data/log/postgresql.log" ], "host" : "pg-standby", "load-env" : true, 
"output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl start postgresql-10.service" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/lib/pgsql/10/data/log/postgresql.log" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "entering standby mode|database system is ready to accept read only connections" ] }, "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "LOG: database system was interrupted; last known up at 2021-12-31 19:47:20 UTC", "LOG: entering standby mode", "LOG: restored log file \"00000008.history\" from archive", "LOG: restored log file \"000000080000000000000022\" from archive", "LOG: redo starts at 0/22000028", "LOG: restored log file \"000000080000000000000023\" from archive", "LOG: consistent recovery state reached at 0/23000088", "LOG: database system is ready to accept read only connections" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \" \\", " begin; \\", " create table replicated_table (message text); \\", " insert into replicated_table values ('Important Data'); \\", " commit; \\", " select * from replicated_table\";" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "Important Data" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " message ", "----------------", " Important Data", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"select * from replicated_table;\"" ], "err-expect" : "1", "highlight" : { "filter" : true, 
"filter-context" : 2, "list" : [ "does not exist" ] }, "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "ERROR: relation \"replicated_table\" does not exist", "LINE 1: select * from replicated_table;", " ^" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"select *, current_timestamp from pg_switch_wal()\";" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " pg_switch_wal | current_timestamp ", "---------------+-------------------------------", " 0/2402B9C0 | 2021-12-31 19:48:05.183178+00", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \" \\", " select *, current_timestamp from replicated_table\"" ], "highlight" : { "filter" : false, "filter-context" : 2, "list" : [ "Important Data" ] }, "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " message | current_timestamp ", "----------------+-------------------------------", " Important Data | 2021-12-31 19:48:09.980843+00", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --log-level-console=info check" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "because this is a standby" ] }, "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: check command begin 2.37: --exec-id=1003-f1cf9bb3 --log-level-console=info --log-level-file=detail --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/pgsql/10/data --repo1-host=repository --repo1-host-ca-file=/etc/pgbackrest/cert/ca.crt --repo1-host-cert-file=/etc/pgbackrest/cert/client.crt --repo1-host-key-file=/etc/pgbackrest/cert/client.key --repo1-host-type=tls --stanza=demo", "P00 INFO: check repo1 (standby)", "P00 INFO: switch wal 
not performed because this is a standby", "P00 INFO: check command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \" \\", " create user replicator password 'jw8s0F4' replication\";" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "CREATE ROLE" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sh -c 'echo \\", " \"host replication replicator 172.17.0.7/32 md5\" \\", " >> /var/lib/pgsql/10/data/pg_hba.conf'" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl reload postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-standby", "option" : { "demo" : { "recovery-option" : { "value" : "primary_conninfo=host=172.17.0.5 port=5432 user=replicator" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/10/data", "recovery-option=primary_conninfo=host=172.17.0.5 port=5432 user=replicator", "repo1-host-ca-file=/etc/pgbackrest/cert/ca.crt", "repo1-host-cert-file=/etc/pgbackrest/cert/client.crt", "repo1-host-key-file=/etc/pgbackrest/cert/client.key", "repo1-host-type=tls", "", "[global]", "log-level-file=detail", "repo1-host=repository", "tls-server-address=*", "tls-server-auth=pgbackrest-client=demo", "tls-server-ca-file=/etc/pgbackrest/cert/ca.crt", "tls-server-cert-file=/etc/pgbackrest/cert/server.crt", "tls-server-key-file=/etc/pgbackrest/cert/server.key" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sh -c 'echo \\", " \"172.17.0.5:*:replication:replicator:jw8s0F4\" \\", " >> /var/lib/pgsql/.pgpass'" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" 
: { "bash-wrap" : true, "cmd" : [ "sudo -u postgres chmod 600 /var/lib/pgsql/.pgpass" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl stop postgresql-10.service" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --delta --type=standby restore" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/lib/pgsql/10/data/recovery.conf" ], "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "# Recovery settings generated by pgBackRest restore on 2021-12-31 19:48:19", "primary_conninfo = 'host=172.17.0.5 port=5432 user=replicator'", "restore_command = 'pgbackrest --stanza=demo archive-get %f \"%p\"'", "standby_mode = 'on'" ] } }, { "key" : { "file" : "/var/lib/pgsql/10/data/postgresql.conf", "host" : "pg-standby", "option" : { "hot_standby" : { "value" : "on" } } }, "type" : "cfg-postgresql", "value" : { "config" : [ "archive_command = 'pgbackrest --stanza=demo archive-push %p'", "archive_mode = on", "hot_standby = on", "log_filename = 'postgresql.log'", "log_line_prefix = ''", "max_wal_senders = 3", "wal_level = replica" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo rm /var/lib/pgsql/10/data/log/postgresql.log" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl start postgresql-10.service" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-standby", "load-env" : true, "output" : 
false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/lib/pgsql/10/data/log/postgresql.log" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "started streaming WAL from primary" ] }, "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 7 lines of output]", "LOG: database system is ready to accept read only connections", "LOG: restored log file \"000000080000000000000024\" from archive", "LOG: started streaming WAL from primary at 0/25000000 on timeline 8" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \" \\", " begin; \\", " create table stream_table (message text); \\", " insert into stream_table values ('Important Data'); \\", " commit; \\", " select *, current_timestamp from stream_table\";" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "Important Data" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " message | current_timestamp ", "----------------+-------------------------------", " Important Data | 2021-12-31 19:48:26.841094+00", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \" \\", " select *, current_timestamp from stream_table\"" ], "highlight" : { "filter" : false, "filter-context" : 2, "list" : [ "Important Data" ] }, "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " message | current_timestamp ", "----------------+-------------------------------", " Important Data | 2021-12-31 19:48:28.277963+00", "(1 row)" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p -m 750 /var/spool/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ 
"sudo chown postgres:postgres /var/spool/pgbackrest" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo mkdir -p -m 750 /var/spool/pgbackrest" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo chown postgres:postgres /var/spool/pgbackrest" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "global" : { "archive-async" : { "value" : "y" }, "spool-path" : { "value" : "/var/spool/pgbackrest" } }, "global:archive-get" : { "process-max" : { "value" : "2" } }, "global:archive-push" : { "process-max" : { "value" : "2" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/10/data", "repo1-host-ca-file=/etc/pgbackrest/cert/ca.crt", "repo1-host-cert-file=/etc/pgbackrest/cert/client.crt", "repo1-host-key-file=/etc/pgbackrest/cert/client.key", "repo1-host-type=tls", "", "[global]", "archive-async=y", "log-level-file=detail", "repo1-host=repository", "spool-path=/var/spool/pgbackrest", "tls-server-address=*", "tls-server-auth=pgbackrest-client=demo", "tls-server-ca-file=/etc/pgbackrest/cert/ca.crt", "tls-server-cert-file=/etc/pgbackrest/cert/server.crt", "tls-server-key-file=/etc/pgbackrest/cert/server.key", "", "[global:archive-get]", "process-max=2", "", "[global:archive-push]", "process-max=2" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-standby", "option" : { "global" : { "archive-async" : { "value" : "y" }, "spool-path" : { "value" : "/var/spool/pgbackrest" } }, "global:archive-get" : { "process-max" : { "value" : "2" } }, "global:archive-push" : { "process-max" : { "value" : "2" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", 
"pg1-path=/var/lib/pgsql/10/data", "recovery-option=primary_conninfo=host=172.17.0.5 port=5432 user=replicator", "repo1-host-ca-file=/etc/pgbackrest/cert/ca.crt", "repo1-host-cert-file=/etc/pgbackrest/cert/client.crt", "repo1-host-key-file=/etc/pgbackrest/cert/client.key", "repo1-host-type=tls", "", "[global]", "archive-async=y", "log-level-file=detail", "repo1-host=repository", "spool-path=/var/spool/pgbackrest", "tls-server-address=*", "tls-server-auth=pgbackrest-client=demo", "tls-server-ca-file=/etc/pgbackrest/cert/ca.crt", "tls-server-cert-file=/etc/pgbackrest/cert/server.crt", "tls-server-key-file=/etc/pgbackrest/cert/server.key", "", "[global:archive-get]", "process-max=2", "", "[global:archive-push]", "process-max=2" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \"alter user replicator password 'bogus'\"" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "ALTER ROLE" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl restart postgresql-10.service" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres rm -f /var/log/pgbackrest/demo-archive-push-async.log" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres psql -c \" \\", " select pg_create_restore_point('test async push'); select pg_switch_wal(); \\", " select pg_create_restore_point('test async push'); select pg_switch_wal(); \\", " select pg_create_restore_point('test async push'); select pg_switch_wal(); \\", " select pg_create_restore_point('test async push'); select pg_switch_wal(); \\", " select pg_create_restore_point('test async push'); select pg_switch_wal();\"" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : 
"exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --log-level-console=info check" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "WAL segment" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: check command begin 2.37: --exec-id=4666-24f1e367 --log-level-console=info --log-level-file=detail --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/pgsql/10/data --repo1-host=repository --repo1-host-ca-file=/etc/pgbackrest/cert/ca.crt --repo1-host-cert-file=/etc/pgbackrest/cert/client.crt --repo1-host-key-file=/etc/pgbackrest/cert/client.key --repo1-host-type=tls --stanza=demo", "P00 INFO: check repo1 configuration (primary)", "P00 INFO: check repo1 archive for WAL (primary)", "P00 INFO: WAL segment 00000008000000000000002A successfully archived to '/var/lib/pgbackrest/archive/demo/10-1/0000000800000000/00000008000000000000002A-91572e10bbcc68d082fb476a0b8492bb395f500f.gz' on repo1", "P00 INFO: check command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/log/pgbackrest/demo-archive-push-async.log" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ " WAL file\\(s\\) to archive|pushed WAL file \\'0000000" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "-------------------PROCESS START-------------------", "P00 INFO: archive-push:async command begin 2.37: [/var/lib/pgsql/10/data/pg_wal] --archive-async --exec-id=4638-a7372d44 --log-level-console=off --log-level-file=detail --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/pgsql/10/data --process-max=2 --repo1-host=repository --repo1-host-ca-file=/etc/pgbackrest/cert/ca.crt --repo1-host-cert-file=/etc/pgbackrest/cert/client.crt --repo1-host-key-file=/etc/pgbackrest/cert/client.key 
--repo1-host-type=tls --spool-path=/var/spool/pgbackrest --stanza=demo", "P00 INFO: push 1 WAL file(s) to archive: 000000080000000000000025", "P01 DETAIL: pushed WAL file '000000080000000000000025' to the archive", "P00 DETAIL: statistics: {\"socket.client\":{\"total\":1},\"socket.session\":{\"total\":1},\"tls.client\":{\"total\":1},\"tls.session\":{\"total\":1}}", "P00 INFO: archive-push:async command end: completed successfully", "", "-------------------PROCESS START-------------------", "P00 INFO: archive-push:async command begin 2.37: [/var/lib/pgsql/10/data/pg_wal] --archive-async --exec-id=4668-f9468f12 --log-level-console=off --log-level-file=detail --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/pgsql/10/data --process-max=2 --repo1-host=repository --repo1-host-ca-file=/etc/pgbackrest/cert/ca.crt --repo1-host-cert-file=/etc/pgbackrest/cert/client.crt --repo1-host-key-file=/etc/pgbackrest/cert/client.key --repo1-host-type=tls --spool-path=/var/spool/pgbackrest --stanza=demo", "P00 INFO: push 4 WAL file(s) to archive: 000000080000000000000026...000000080000000000000029", "P01 DETAIL: pushed WAL file '000000080000000000000026' to the archive", "P02 DETAIL: pushed WAL file '000000080000000000000027' to the archive", "P01 DETAIL: pushed WAL file '000000080000000000000028' to the archive", "P02 DETAIL: pushed WAL file '000000080000000000000029' to the archive", "P00 DETAIL: statistics: {\"socket.client\":{\"total\":1},\"socket.session\":{\"total\":1},\"tls.client\":{\"total\":1},\"tls.session\":{\"total\":1}}", "P00 INFO: archive-push:async command end: completed successfully", "", "-------------------PROCESS START-------------------", "P00 INFO: archive-push:async command begin 2.37: [/var/lib/pgsql/10/data/pg_wal] --archive-async --exec-id=4676-d80866d0 --log-level-console=off --log-level-file=detail --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/pgsql/10/data --process-max=2 --repo1-host=repository 
--repo1-host-ca-file=/etc/pgbackrest/cert/ca.crt --repo1-host-cert-file=/etc/pgbackrest/cert/client.crt --repo1-host-key-file=/etc/pgbackrest/cert/client.key --repo1-host-type=tls --spool-path=/var/spool/pgbackrest --stanza=demo", "P00 INFO: push 1 WAL file(s) to archive: 00000008000000000000002A", "P01 DETAIL: pushed WAL file '00000008000000000000002A' to the archive", "P00 DETAIL: statistics: {\"socket.client\":{\"total\":1},\"socket.session\":{\"total\":1},\"tls.client\":{\"total\":1},\"tls.session\":{\"total\":1}}", "P00 INFO: archive-push:async command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 5" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres cat /var/log/pgbackrest/demo-archive-get-async.log" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "found [0-F]{24} in the .* archive" ] }, "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "-------------------PROCESS START-------------------", "P00 INFO: archive-get:async command begin 2.37: [000000080000000000000022, 000000080000000000000023, 000000080000000000000024, 000000080000000000000025, 000000080000000000000026, 000000080000000000000027, 000000080000000000000028, 000000080000000000000029] --archive-async --exec-id=1485-81fe8c6f --log-level-console=off --log-level-file=detail --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/pgsql/10/data --process-max=2 --repo1-host=repository --repo1-host-ca-file=/etc/pgbackrest/cert/ca.crt --repo1-host-cert-file=/etc/pgbackrest/cert/client.crt --repo1-host-key-file=/etc/pgbackrest/cert/client.key --repo1-host-type=tls --spool-path=/var/spool/pgbackrest --stanza=demo", "P00 INFO: get 8 WAL file(s) from archive: 000000080000000000000022...000000080000000000000029", "P01 DETAIL: found 
000000080000000000000022 in the repo1: 10-1 archive", "P02 DETAIL: found 000000080000000000000023 in the repo1: 10-1 archive", "P01 DETAIL: found 000000080000000000000024 in the repo1: 10-1 archive", "P00 DETAIL: unable to find 000000080000000000000025 in the archive", "P00 DETAIL: statistics: {\"socket.client\":{\"total\":1},\"socket.session\":{\"total\":1},\"tls.client\":{\"total\":1},\"tls.session\":{\"total\":1}}", " [filtered 24 lines of output]", "P00 INFO: archive-get:async command begin 2.37: [000000080000000000000025, 000000080000000000000026, 000000080000000000000027, 000000080000000000000028, 000000080000000000000029, 00000008000000000000002A, 00000008000000000000002B, 00000008000000000000002C] --archive-async --exec-id=1505-c945be79 --log-level-console=off --log-level-file=detail --log-level-stderr=off --no-log-timestamp --pg1-path=/var/lib/pgsql/10/data --process-max=2 --repo1-host=repository --repo1-host-ca-file=/etc/pgbackrest/cert/ca.crt --repo1-host-cert-file=/etc/pgbackrest/cert/client.crt --repo1-host-key-file=/etc/pgbackrest/cert/client.key --repo1-host-type=tls --spool-path=/var/spool/pgbackrest --stanza=demo", "P00 INFO: get 8 WAL file(s) from archive: 000000080000000000000025...00000008000000000000002C", "P01 DETAIL: found 000000080000000000000025 in the repo1: 10-1 archive", "P02 DETAIL: found 000000080000000000000026 in the repo1: 10-1 archive", "P01 DETAIL: found 000000080000000000000027 in the repo1: 10-1 archive", "P02 DETAIL: found 000000080000000000000028 in the repo1: 10-1 archive", "P01 DETAIL: found 000000080000000000000029 in the repo1: 10-1 archive", "P02 DETAIL: found 00000008000000000000002A in the repo1: 10-1 archive", "P00 DETAIL: unable to find 00000008000000000000002B in the archive", "P00 DETAIL: statistics: {\"socket.client\":{\"total\":1},\"socket.session\":{\"total\":1},\"tls.client\":{\"total\":1},\"tls.session\":{\"total\":1}}", " [filtered 14 lines of output]" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u 
postgres psql -c \"alter user replicator password 'jw8s0F4'\"" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "ALTER ROLE" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "repository", "option" : { "demo" : { "pg2-host" : { "value" : "pg-standby" }, "pg2-host-ca-file" : { "value" : "/etc/pgbackrest/cert/ca.crt" }, "pg2-host-cert-file" : { "value" : "/etc/pgbackrest/cert/client.crt" }, "pg2-host-key-file" : { "value" : "/etc/pgbackrest/cert/client.key" }, "pg2-host-type" : { "value" : "tls" }, "pg2-path" : { "value" : "/var/lib/pgsql/10/data" } }, "global" : { "backup-standby" : { "value" : "y" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-host=pg-primary", "pg1-host-ca-file=/etc/pgbackrest/cert/ca.crt", "pg1-host-cert-file=/etc/pgbackrest/cert/client.crt", "pg1-host-key-file=/etc/pgbackrest/cert/client.key", "pg1-host-type=tls", "pg1-path=/var/lib/pgsql/10/data", "pg2-host=pg-standby", "pg2-host-ca-file=/etc/pgbackrest/cert/ca.crt", "pg2-host-cert-file=/etc/pgbackrest/cert/client.crt", "pg2-host-key-file=/etc/pgbackrest/cert/client.key", "pg2-host-type=tls", "pg2-path=/var/lib/pgsql/10/data", "", "[global]", "backup-standby=y", "process-max=3", "repo1-path=/var/lib/pgbackrest", "repo1-retention-full=2", "start-fast=y", "tls-server-address=*", "tls-server-auth=pgbackrest-client=demo", "tls-server-ca-file=/etc/pgbackrest/cert/ca.crt", "tls-server-cert-file=/etc/pgbackrest/cert/server.crt", "tls-server-key-file=/etc/pgbackrest/cert/server.key" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo --log-level-console=detail backup" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "backup file pg-primary|replay on the standby" ] }, "host" : "repository", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 2 lines of 
output]", "P00 INFO: execute non-exclusive pg_start_backup(): backup begins after the requested immediate checkpoint completes", "P00 INFO: backup start archive = 00000008000000000000002C, lsn = 0/2C000028", "P00 INFO: wait for replay on the standby to reach 0/2C000028", "P00 INFO: replay on the standby reached 0/2C000028", "P00 INFO: check archive for prior segment 00000008000000000000002B", "P01 DETAIL: backup file pg-primary:/var/lib/pgsql/10/data/global/pg_control (8KB, 0%) checksum e796914fe27be239cc661ddb5b86a893ffcd1473", "P02 DETAIL: backup file pg-standby:/var/lib/pgsql/10/data/base/13017/2608 (440KB, 19%) checksum e717affd3b3f4a27eaf400f668c98108998f5a0f", "P03 DETAIL: backup file pg-standby:/var/lib/pgsql/10/data/base/13017/1249 (392KB, 36%) checksum 42cb2ebdf4a3d5bb7a20ef360a2ce7ed9f96068c", "P04 DETAIL: backup file pg-standby:/var/lib/pgsql/10/data/base/13017/2674 (368KB, 52%) checksum 43f17b5d5a0f8fad15d1acf09b0df35cb269cd9f", "P01 DETAIL: backup file pg-primary:/var/lib/pgsql/10/data/log/postgresql.log (6KB, 52%) checksum daedeede97741d6a3011aaadca7f0f8db81f0d15", "P01 DETAIL: backup file pg-primary:/var/lib/pgsql/10/data/pg_hba.conf (4.2KB, 52%) checksum 12abee43e7eabfb3ff6239f3fc9bc3598293557d", "P01 DETAIL: backup file pg-primary:/var/lib/pgsql/10/data/current_logfiles (26B, 52%) checksum 78a9f5c10960f0d91fcd313937469824861795a2", "P01 DETAIL: backup file pg-primary:/var/lib/pgsql/10/data/pg_logical/replorigin_checkpoint (8B, 52%) checksum 347fc8f2df71bd4436e38bd1516ccd7ea0d46532", "P02 DETAIL: backup file pg-standby:/var/lib/pgsql/10/data/base/13017/2673 (328KB, 66%) checksum 0835d0b60054ed006e0d61f016d235f60a8a695f", "P03 DETAIL: backup file pg-standby:/var/lib/pgsql/10/data/base/13017/2658 (112KB, 71%) checksum d56cd71531fe675d759c3d6ad4a6b3ab20c5bc87", " [filtered 1253 lines of output]" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl stop postgresql-10.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, 
"run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl stop postgresql-10.service" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres /usr/pgsql-11/bin/initdb \\", " -D /var/lib/pgsql/11/data -k -A peer" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sh -c 'cd /var/lib/pgsql && \\", " /usr/pgsql-11/bin/pg_upgrade \\", " --old-bindir=/usr/pgsql-10/bin \\", " --new-bindir=/usr/pgsql-11/bin \\", " --old-datadir=/var/lib/pgsql/10/data \\", " --new-datadir=/var/lib/pgsql/11/data \\", " --old-options=\" -c config_file=/var/lib/pgsql/10/data/postgresql.conf\" \\", " --new-options=\" -c config_file=/var/lib/pgsql/11/data/postgresql.conf\"'" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "Upgrade Complete" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ " [filtered 72 lines of output]", "Checking for extension updates ok", "", "Upgrade Complete", "----------------", "Optimizer statistics are not transferred by pg_upgrade so,", " [filtered 4 lines of output]" ] } }, { "key" : { "file" : "/var/lib/pgsql/11/data/postgresql.conf", "host" : "pg-primary", "option" : { "archive_command" : { "value" : "'pgbackrest --stanza=demo archive-push %p'" }, "archive_mode" : { "value" : "on" }, "listen_addresses" : { "value" : "'*'" }, "log_line_prefix" : { "value" : "''" }, "max_wal_senders" : { "value" : "3" }, "port" : { "value" : "5432" }, "wal_level" : { "value" : "replica" } } }, "type" : "cfg-postgresql", "value" : { "config" : [ "archive_command = 'pgbackrest --stanza=demo archive-push %p'", "archive_mode = on", "listen_addresses = '*'", "log_line_prefix = ''", "max_wal_senders = 3", "port = 5432", "wal_level = 
replica" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-primary", "option" : { "demo" : { "pg1-path" : { "value" : "/var/lib/pgsql/11/data" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/11/data", "repo1-host-ca-file=/etc/pgbackrest/cert/ca.crt", "repo1-host-cert-file=/etc/pgbackrest/cert/client.crt", "repo1-host-key-file=/etc/pgbackrest/cert/client.key", "repo1-host-type=tls", "", "[global]", "archive-async=y", "log-level-file=detail", "repo1-host=repository", "spool-path=/var/spool/pgbackrest", "tls-server-address=*", "tls-server-auth=pgbackrest-client=demo", "tls-server-ca-file=/etc/pgbackrest/cert/ca.crt", "tls-server-cert-file=/etc/pgbackrest/cert/server.crt", "tls-server-key-file=/etc/pgbackrest/cert/server.key", "", "[global:archive-get]", "process-max=2", "", "[global:archive-push]", "process-max=2" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "pg-standby", "option" : { "demo" : { "pg1-path" : { "value" : "/var/lib/pgsql/11/data" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-path=/var/lib/pgsql/11/data", "recovery-option=primary_conninfo=host=172.17.0.5 port=5432 user=replicator", "repo1-host-ca-file=/etc/pgbackrest/cert/ca.crt", "repo1-host-cert-file=/etc/pgbackrest/cert/client.crt", "repo1-host-key-file=/etc/pgbackrest/cert/client.key", "repo1-host-type=tls", "", "[global]", "archive-async=y", "log-level-file=detail", "repo1-host=repository", "spool-path=/var/spool/pgbackrest", "tls-server-address=*", "tls-server-auth=pgbackrest-client=demo", "tls-server-ca-file=/etc/pgbackrest/cert/ca.crt", "tls-server-cert-file=/etc/pgbackrest/cert/server.crt", "tls-server-key-file=/etc/pgbackrest/cert/server.key", "", "[global:archive-get]", "process-max=2", "", "[global:archive-push]", "process-max=2" ] } }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "repository", "option" : { "demo" : { "pg1-path" : { "value" : 
"/var/lib/pgsql/11/data" }, "pg2-path" : { "value" : "/var/lib/pgsql/11/data" } }, "global" : { "backup-standby" : { "value" : "n" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-host=pg-primary", "pg1-host-ca-file=/etc/pgbackrest/cert/ca.crt", "pg1-host-cert-file=/etc/pgbackrest/cert/client.crt", "pg1-host-key-file=/etc/pgbackrest/cert/client.key", "pg1-host-type=tls", "pg1-path=/var/lib/pgsql/11/data", "pg2-host=pg-standby", "pg2-host-ca-file=/etc/pgbackrest/cert/ca.crt", "pg2-host-cert-file=/etc/pgbackrest/cert/client.crt", "pg2-host-key-file=/etc/pgbackrest/cert/client.key", "pg2-host-type=tls", "pg2-path=/var/lib/pgsql/11/data", "", "[global]", "backup-standby=n", "process-max=3", "repo1-path=/var/lib/pgbackrest", "repo1-retention-full=2", "start-fast=y", "tls-server-address=*", "tls-server-auth=pgbackrest-client=demo", "tls-server-ca-file=/etc/pgbackrest/cert/ca.crt", "tls-server-cert-file=/etc/pgbackrest/cert/server.crt", "tls-server-key-file=/etc/pgbackrest/cert/server.key" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo cp /var/lib/pgsql/10/data/pg_hba.conf \\", " /var/lib/pgsql/11/data/pg_hba.conf" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --no-online \\", " --log-level-console=info stanza-upgrade" ], "highlight" : { "filter" : true, "filter-context" : 2, "list" : [ "completed successfully" ] }, "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 INFO: stanza-upgrade command begin 2.37: --exec-id=5116-27923273 --log-level-console=info --log-level-file=detail --log-level-stderr=off --no-log-timestamp --no-online --pg1-path=/var/lib/pgsql/11/data --repo1-host=repository --repo1-host-ca-file=/etc/pgbackrest/cert/ca.crt --repo1-host-cert-file=/etc/pgbackrest/cert/client.crt 
--repo1-host-key-file=/etc/pgbackrest/cert/client.key --repo1-host-type=tls --stanza=demo", "P00 INFO: stanza-upgrade for stanza 'demo' on repo1", "P00 INFO: stanza-upgrade command end: completed successfully" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl start postgresql-11.service" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres systemctl status postgresql-11.service" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo check" ], "host" : "pg-primary", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo rm -rf /var/lib/pgsql/10/data" ], "host" : "pg-primary", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo rm -rf /var/lib/pgsql/10/data" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres mkdir -p -m 700 /usr/pgsql-11/bin" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo check" ], "host" : "repository", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe", "value" : { "output" : [ "P00 WARN: unable to check pg-2: [DbConnectError] raised from remote-0 tls protocol on 'pg-standby': unable to connect to 'dbname='postgres' port=5432': could not connect to server: No such file or directory", " \tIs the server running locally and accepting", " \tconnections on Unix domain socket \"/var/run/postgresql/.s.PGSQL.5432\"?" 
] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u pgbackrest pgbackrest --stanza=demo --type=full backup" ], "host" : "repository", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo --type=standby restore" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/var/lib/pgsql/11/data/postgresql.conf", "host" : "pg-standby", "option" : { "hot_standby" : { "value" : "on" } } }, "type" : "cfg-postgresql", "value" : { "config" : [ "hot_standby = on" ] } }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo systemctl start postgresql-11.service" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres sleep 2" ], "host" : "pg-standby", "load-env" : true, "output" : false, "run-as-user" : null }, "type" : "exe" }, { "key" : { "bash-wrap" : true, "cmd" : [ "sudo -u postgres pgbackrest --stanza=demo check" ], "host" : "pg-standby", "load-env" : true, "output" : true, "run-as-user" : null }, "type" : "exe" }, { "key" : { "file" : "/etc/pgbackrest/pgbackrest.conf", "host" : "repository", "option" : { "global" : { "backup-standby" : { "value" : "y" } } } }, "type" : "cfg-pgbackrest", "value" : { "config" : [ "[demo]", "pg1-host=pg-primary", "pg1-host-ca-file=/etc/pgbackrest/cert/ca.crt", "pg1-host-cert-file=/etc/pgbackrest/cert/client.crt", "pg1-host-key-file=/etc/pgbackrest/cert/client.key", "pg1-host-type=tls", "pg1-path=/var/lib/pgsql/11/data", "pg2-host=pg-standby", "pg2-host-ca-file=/etc/pgbackrest/cert/ca.crt", "pg2-host-cert-file=/etc/pgbackrest/cert/client.crt", "pg2-host-key-file=/etc/pgbackrest/cert/client.key", "pg2-host-type=tls", "pg2-path=/var/lib/pgsql/11/data", "", "[global]", "backup-standby=y", "process-max=3", "repo1-path=/var/lib/pgbackrest", 
"repo1-retention-full=2", "start-fast=y", "tls-server-address=*", "tls-server-auth=pgbackrest-client=demo", "tls-server-ca-file=/etc/pgbackrest/cert/ca.crt", "tls-server-cert-file=/etc/pgbackrest/cert/server.crt", "tls-server-key-file=/etc/pgbackrest/cert/server.key" ] } } ] } } } pgbackrest-release-2.37/doc/resource/fake-cert/000077500000000000000000000000001416457663300214645ustar00rootroot00000000000000pgbackrest-release-2.37/doc/resource/fake-cert/.gitignore000066400000000000000000000000141416457663300234470ustar00rootroot00000000000000*.csr *.srl pgbackrest-release-2.37/doc/resource/fake-cert/README.md000066400000000000000000000030601416457663300227420ustar00rootroot00000000000000# pgBackRest Documentation Certificates The certificates in this directory are used for documentation generation only and should not be used for actual services. ## pgBackRest CA Generate a CA that will be used to sign documentation certificates. It can be installed in the documentation containers to make certificates signed by it valid. ``` cd [pgbackrest-root]/doc/resource/fake-cert openssl ecparam -genkey -name prime256v1 | openssl ec -out ca.key openssl req -new -x509 -extensions v3_ca -key ca.key -out ca.crt -days 99999 \ -subj "/C=US/ST=All/L=All/O=pgBackRest/CN=pgbackrest.org" ``` ## S3 Certificate Mimic an S3 certificate for the `us-east-1`/`us-east-2` region to generate S3 documentation. ``` cd [pgbackrest-root]/doc/resource/fake-cert openssl ecparam -genkey -name prime256v1 | openssl ec -out s3-server.key openssl req -new -sha256 -nodes -out s3-server.csr -key s3-server.key -config s3.cnf openssl x509 -req -in s3-server.csr -CA ca.crt -CAkey ca.key -CAcreateserial \ -out s3-server.crt -days 99999 -extensions v3_req -extfile s3.cnf ``` ## Azure Certificate Mimic an Azure certificate for the `*.blob.core.windows.net` hosts to generate Azure documentation. 
```
cd [pgbackrest-root]/doc/resource/fake-cert
openssl ecparam -genkey -name prime256v1 | openssl ec -out azure-server.key
openssl req -new -sha256 -nodes -out azure-server.csr -key azure-server.key -config azure.cnf
openssl x509 -req -in azure-server.csr -CA ca.crt -CAkey ca.key -CAcreateserial \
    -out azure-server.crt -days 99999 -extensions v3_req -extfile azure.cnf
```

pgbackrest-release-2.37/doc/resource/fake-cert/azure-server.crt
-----BEGIN CERTIFICATE-----
MIICJjCCAc2gAwIBAgIUdW+DRN7XbILssJmdxycMz90EEwUwCgYIKoZIzj0EAwIw
VzELMAkGA1UEBhMCVVMxDDAKBgNVBAgMA0FsbDEMMAoGA1UEBwwDQWxsMRMwEQYD
VQQKDApwZ0JhY2tSZXN0MRcwFQYDVQQDDA5wZ2JhY2tyZXN0Lm9yZzAgFw0yMDA2
MjkxOTM0MjhaGA8yMjk0MDQxMzE5MzQyOFowdzELMAkGA1UEBhMCVVMxDDAKBgNV
BAgMA0FsbDEMMAoGA1UEBwwDQWxsMRMwEQYDVQQKDApwZ0JhY2tSZXN0MRwwGgYD
VQQLDBNVbml0IFRlc3RpbmcgRG9tYWluMRkwFwYDVQQDDBBjb3JlLndpbmRvd3Mu
bmV0MFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEqQy14z/cTAwvIDUCgU+5ATJh
5hsvMaUrYfuCEFC9tx7+zeqrEbtWOqO1dQVnCfZr38lwrTDzJvZJKqh4rTlWoKNV
MFMwCQYDVR0TBAIwADALBgNVHQ8EBAMCBeAwOQYDVR0RBDIwMIIVYmxvYi5jb3Jl
LndpbmRvd3MubmV0ghcqLmJsb2IuY29yZS53aW5kb3dzLm5ldDAKBggqhkjOPQQD
AgNHADBEAiB5RbKWvkzISbAHRqkg4egKcitsijqZsPJgpj4X91ercwIgBJmMNKVP
ELrECSmLFbJQCIZJAMcbzmLxZNcnsRaMUG8=
-----END CERTIFICATE-----

pgbackrest-release-2.37/doc/resource/fake-cert/azure-server.key
-----BEGIN EC PRIVATE KEY-----
MHcCAQEEIEGn3zrwzQ8+ZP6i+eye3iqQybiBK4ap+JAQ0uNGEMP1oAoGCCqGSM49
AwEHoUQDQgAEqQy14z/cTAwvIDUCgU+5ATJh5hsvMaUrYfuCEFC9tx7+zeqrEbtW
OqO1dQVnCfZr38lwrTDzJvZJKqh4rTlWoA==
-----END EC PRIVATE KEY-----

pgbackrest-release-2.37/doc/resource/fake-cert/azure.cnf
[req]
default_bits = 4096
prompt = no
default_md = sha256
req_extensions = v3_req
distinguished_name = dn

[ dn ]
C=US
ST=All
L=All
O=pgBackRest
OU=Unit Testing Domain
CN = core.windows.net

[ v3_req ]
basicConstraints = CA:FALSE
keyUsage = nonRepudiation, digitalSignature, keyEncipherment
subjectAltName = @alt_names

[ alt_names ]
DNS.1 = blob.core.windows.net
DNS.2 = *.blob.core.windows.net

pgbackrest-release-2.37/doc/resource/fake-cert/ca.crt
-----BEGIN CERTIFICATE-----
MIIB+jCCAaCgAwIBAgIJAJDUUhiBUbmEMAoGCCqGSM49BAMCMFcxCzAJBgNVBAYT
AlVTMQwwCgYDVQQIDANBbGwxDDAKBgNVBAcMA0FsbDETMBEGA1UECgwKcGdCYWNr
UmVzdDEXMBUGA1UEAwwOcGdiYWNrcmVzdC5vcmcwIBcNMTkwNTI3MDAxOTU5WhgP
MjI5MzAzMTAwMDE5NTlaMFcxCzAJBgNVBAYTAlVTMQwwCgYDVQQIDANBbGwxDDAK
BgNVBAcMA0FsbDETMBEGA1UECgwKcGdCYWNrUmVzdDEXMBUGA1UEAwwOcGdiYWNr
cmVzdC5vcmcwWTATBgcqhkjOPQIBBggqhkjOPQMBBwNCAAQYHUcSknRDL+fgFJZI
IC73Ju75yA0203IxPO35i8mVb9CcWVhEgHmS+cQ6SfY6GC7V61VB7gwzQ+XESi2p
ndhJo1MwUTAdBgNVHQ4EFgQUYMbKIlTUE6gklw8KcSC6fnlOitwwHwYDVR0jBBgw
FoAUYMbKIlTUE6gklw8KcSC6fnlOitwwDwYDVR0TAQH/BAUwAwEB/zAKBggqhkjO
PQQDAgNIADBFAiEA1Bzy17/6jQimg3ROZTrVGkRtAuzTtjgDParHFrIhSDoCIH43
OeOUaPVb0rXGPLu9rFpjPOmtFSW3lf4skheJMKyN
-----END CERTIFICATE-----

pgbackrest-release-2.37/doc/resource/fake-cert/ca.key
-----BEGIN EC PRIVATE KEY-----
MHcCAQEEIB5f3SxfiZ92GMpuqpfTiPO3xaVOnxRh6qVAoRtu7NOZoAoGCCqGSM49
AwEHoUQDQgAEGB1HEpJ0Qy/n4BSWSCAu9ybu+cgNNtNyMTzt+YvJlW/QnFlYRIB5
kvnEOkn2Ohgu1etVQe4MM0PlxEotqZ3YSQ==
-----END EC PRIVATE KEY-----

pgbackrest-release-2.37/doc/resource/fake-cert/s3-server.crt
-----BEGIN CERTIFICATE-----
MIICbTCCAhOgAwIBAgIJAODTXyGnxWtVMAoGCCqGSM49BAMCMFcxCzAJBgNVBAYT
AlVTMQwwCgYDVQQIDANBbGwxDDAKBgNVBAcMA0FsbDETMBEGA1UECgwKcGdCYWNr
UmVzdDEXMBUGA1UEAwwOcGdiYWNrcmVzdC5vcmcwIBcNMTkwNTI3MDIwODEwWhgP
MjI5MzAzMTAwMjA4MTBaMIGBMQswCQYDVQQGEwJVUzEMMAoGA1UECAwDQWxsMQww
CgYDVQQHDANBbGwxEzARBgNVBAoMCnBnQmFja1Jlc3QxHDAaBgNVBAsME1VuaXQg
VGVzdGluZyBEb21haW4xIzAhBgNVBAMMGnMzLnVzLWVhc3QtMS5hbWF6b25hd3Mu
Y29tMFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEEe2dO1v1gE0Qj4H407i0K8tN
kASkveckACPFzXs2i/++rZY4bwUub08JcMRv0WWwnRzOoumsN26Ge454vTbjoqOB
mjCBlzAJBgNVHRMEAjAAMAsGA1UdDwQEAwIF4DB9BgNVHREEdjB0ghpzMy51cy1l
YXN0LTEuYW1hem9uYXdzLmNvbYIcKi5zMy51cy1lYXN0LTEuYW1hem9uYXdzLmNv
bYIaczMudXMtZWFzdC0yLmFtYXpvbmF3cy5jb22CHCouczMudXMtZWFzdC0yLmFt
YXpvbmF3cy5jb20wCgYIKoZIzj0EAwIDSAAwRQIgLiE7LuK6O/bKo70XPUi6xoDE
ew+EHO31klTOeWiS6oMCIQCHMEqSAcDF/gnG/UXnp2viHOFjnY+NZgQo76l+/2mE
iQ==
-----END CERTIFICATE-----

pgbackrest-release-2.37/doc/resource/fake-cert/s3-server.key
-----BEGIN EC PRIVATE KEY-----
MHcCAQEEIBhweMaCuhrRJy6hLV9X7QRCorDdyiUvSWEySHXZJM4DoAoGCCqGSM49
AwEHoUQDQgAEEe2dO1v1gE0Qj4H407i0K8tNkASkveckACPFzXs2i/++rZY4bwUu
b08JcMRv0WWwnRzOoumsN26Ge454vTbjog==
-----END EC PRIVATE KEY-----

pgbackrest-release-2.37/doc/resource/fake-cert/s3.cnf
[req]
default_bits = 4096
prompt = no
default_md = sha256
req_extensions = v3_req
distinguished_name = dn

[ dn ]
C=US
ST=All
L=All
O=pgBackRest
OU=Unit Testing Domain
CN = s3.us-east-1.amazonaws.com

[ v3_req ]
basicConstraints = CA:FALSE
keyUsage = nonRepudiation, digitalSignature, keyEncipherment
subjectAltName = @alt_names

[ alt_names ]
DNS.1 = s3.us-east-1.amazonaws.com
DNS.2 = *.s3.us-east-1.amazonaws.com
DNS.3 = s3.us-east-2.amazonaws.com
DNS.4 = *.s3.us-east-2.amazonaws.com

pgbackrest-release-2.37/doc/resource/git-history.cache
[ { "commit": "62fbee72ad319f92c9410ac8dbab2f81fe945a36", "date": "2022-01-01 10:50:16 -0500", "subject": "Update LICENSE.txt and PostgreSQL copyright for 2022." }, { "commit": "d6ebf6e2d67331a838f53beda1c186c527b56a8e", "date": "2021-12-30 18:54:36 -0500", "subject": "Remove dead test code."
}, { "commit": "fccb7f7dd45c6c373d0cfa74b90d69ca483aa3af", "date": "2021-12-28 17:39:22 -0500", "subject": "Add release note regarding IANA approval of the default TLS port." }, { "commit": "6a12458440168f13cb05d70f36ea54b1860e390c", "date": "2021-12-16 10:30:59 -0500", "subject": "Parse protocol/port in S3/Azure endpoints.", "body": "Utilize httpUrlNewParseP() to parse endpoint and port from the URL in the S3 and Azure helpers to avoid issues where protocol was not expected to be part of the URL." }, { "commit": "f06101de77a980c7e4115762f2fc301280aa4127", "date": "2021-12-16 09:47:04 -0500", "subject": "Add TLS server documentation.", "body": "Add documentation and make the feature visible." }, { "commit": "615bdff4030a31bfedfe7df04676e3948ec9c2c0", "date": "2021-12-14 14:53:41 -0500", "subject": "Fix socket leak on connection retries.", "body": "This leak was caused by the file descriptor variable getting clobbered after a long jump. Mark it as volatile to fix.\r\n\r\nTesting this is a bit complex because the issue only happens in optimized builds, if at all. Put the test into the performance suite, which is always optimized, until a better idea presents itself." }, { "commit": "a73fe4eb966f9685f6e4179c397a10c1e7f15f19", "date": "2021-12-10 15:53:40 -0500", "subject": "Fix restore delta link mapping when path/file already exists.", "body": "If a path/file was remapped to a link using either --link-map or --link-all there would be no affect if the path/file already existed. If a link existed it would be properly updated and converting a link to a path/file also worked.\r\n\r\nThe issue happened during delta cleanup, which failed to check if the existing path/file had been remapped to a link.\r\n\r\nAdd checks for newly mapped path/file links and remove the old path/file we required." 
}, { "commit": "19a7ec69debfe6587fcc1163451896590c96bf21", "date": "2021-12-10 15:04:55 -0500", "subject": "Close expect log file when unit test completes.", "body": "This did not cause any issues, but it is better to explicitly close open files." }, { "commit": "c38e2d31709804eb4b9125a15ad84c8fc813f366", "date": "2021-12-08 15:00:19 -0500", "subject": "Add verb to HTTP error output.", "body": "This makes it easier to debug HTTP errors." }, { "commit": "be4ac3923cb77873da298a30aca5d847b3c635af", "date": "2021-12-08 13:57:26 -0500", "subject": "Error when restore is unable to find a backup to match the time target.", "body": "This was previously a warning but the warning is easy to miss so a lot of time may be lost restoring and recovering a backup that will not hit the target.\r\n\r\nSince this is technically a breaking change, add an \"important note\" about the change to the release." }, { "commit": "672330593789f07aaad90bbafcd2597cbc602686", "date": "2021-12-08 12:29:20 -0500", "subject": "Add warning when checkpoint_timeout exceeds db-timeout.", "body": "In the backup command, add a warning if start-fast is disabled and the PostgreSQL checkpoint_timeout is greater than db-timeout.\r\n\r\nIn such cases, we might timeout before the checkpoint occurs and the backup really starts." }, { "commit": "bd2ba802db11c505ec69943fa81b2b379073fbf4", "date": "2021-12-08 10:16:41 -0500", "subject": "Check that clusters are alive and correctly configured during a backup.", "body": "Fail the backup if a cluster stops or the standby is promoted. Previously, shutting down the primary would cause an error but it was not detected until the end of the backup. Now the error will happen sooner and a promotion on the standby will also cause an error." }, { "commit": "7b3ea883c7c010aafbeb14d150d073a113b703e4", "date": "2021-12-07 18:18:43 -0500", "subject": "Add SIGTERM and SIGHUP handling to TLS server.", "body": "SIGHUP allows the configuration to be reloaded. 
Note that the configuration will not be updated in child processes that have already started.\r\n\r\nSIGTERM terminates the server process gracefully and sends SIGTERM to all child processes. This also gives the tests an easy way to stop the server." }, { "commit": "49145d72bac16498cdbf5eeb3cd6128ea0be0667", "date": "2021-12-07 09:21:07 -0500", "subject": "Add timeline and checkpoint checks to backup.", "body": "Add the following checks:\r\n\r\n* Checkpoint is updated in pg_control after pg_start_backup(). This helps ensure that PostgreSQL and pgBackRest have a consistent view of the storage and that PGDATA paths match.\r\n* Timeline of backup start WAL file matches pg_control. Hard to see how this one could get hit, but we have the power...\r\n* Standby is on the same timeline as the primary. If not, this standby is not following the primary.\r\n* Last standby checkpoint is not greater than the backup checkpoint. If so, this standby is not following the primary.\r\n\r\nThis also requires some additional plumbing to read/write timeline/checkpoint from pg_control and parse timelines from WAL filenames. There were some changes in the backup tests caused by the fact that pg_control now has different contents for each backup.\r\n\r\nThe check to ensure that the required checkpoint was reached on the standby should also be updated to use pg_control (it currently uses pg_control_checkpoint()), but that requires non-trivial changes to the test harness and will need to wait." }, { "commit": "9c76056dd0d1d2b07a89646b087c5c8d36ab97f5", "date": "2021-11-30 16:21:15 -0500", "subject": "Add error type and message to CHECK() macro.", "body": "A CHECK() worked exactly like ASSERT() except that it was compiled into production code. However, over time many checks have been added that should not throw AssertError, which should be reserved for probable coding errors.\n\nAllow the error code to be specified so other error types can be thrown. 
Also add a human-readable message since many of these could be seen by users even when there is no coding error.\n\nUpdate coverage exceptions for CHECK() to match ASSERT() since all conditions will never be covered." }, { "commit": "0895cfcdf7d3f15b8029f73ed62c6094d30724b3", "date": "2021-11-30 13:23:11 -0500", "subject": "Add HRN_PG_CONTROL_PUT() and HRN_PG_CONTROL_TIME().", "body": "These macros simplify management of pg_control test files.\n\nCentralize time updates for pg_control in the command/backup module. This caused some time updates in the logs.\n\nFinally, move the postgres module after the storage module so it can use storage macros." }, { "commit": "01ac6b6cac86ea857e54a3b1c45077df1e128a75", "date": "2021-11-30 08:28:36 -0500", "subject": "Autogenerate test system identifiers.", "body": "hrnPgControlToBuffer() and hrnPgWalToBuffer() now generate the system id based on the version of Postgres. If a value less than 100 is specified for systemId then it will be added to the default system id so there can be multiple ids for a single version of PostgreSQL.\n\nAdd constants to represent version system ids in tests. These will eventually be auto-generated.\n\nThis changes some checksums and we no longer have big-endian tests systems, so X those checksums out so it is obvious they are no longer valid." }, { "commit": "3f7409019df112ec50efb6c3db6f7780c9a63c87", "date": "2021-11-24 16:09:45 -0500", "subject": "Ensure ASSERT() macro is always available in test modules.", "body": "Tests that run without DEBUG for performance did not have ASSERT() and were using CHECK() instead.\n\nInstead ensure that the ASSERT() macro is always available in tests." }, { "commit": "dcb4f09d8315e92c0877b589f3fa9b7f0fa65f93", "date": "2021-11-23 09:37:12 -0500", "subject": "Revert changes to backupFilePut() made in 1e77fc3d.", "body": "These changes were made obsolete by a3d7a23a." 
}, { "commit": "7e35245dc3416238a84a43abbecdf976170dea91", "date": "2021-11-23 08:07:31 -0500", "subject": "Use ASSERT() or TEST_RESULT*() instead of CHECK() in test modules." }, { "commit": "a3d7a23a9d90611a3d31947598fbea240b250710", "date": "2021-11-22 12:52:37 -0500", "subject": "Use infoBackupDataByLabel() to log backup size.", "body": "Eliminate summing and passing of copied files sizes for logging backup size.\r\n\r\nInstead, utilize infoBackupDataByLabel() to pull the backup size for the log message." }, { "commit": "1a0560d363d28737befb8c222647783d4fc2ca29", "date": "2021-11-19 12:22:09 -0500", "subject": "Allow y/n arguments for boolean command-line options.", "body": "This allows boolean boolean command-line options to work like their config file equivalents.\r\n\r\nAt least for now this behavior will remain undocumented since all examples in the documentation will continue to use the standard syntax. The idea is that it will \"just work\" when options are copied out of config files rather than generating an error." }, { "commit": "2d963ce9471808172f879916c3f3accc35f14d56", "date": "2021-11-18 17:23:11 -0500", "subject": "Rename server-start command to server." }, { "commit": "1f14f45dfb0d1677a695719381cbd5a8a3c6c986", "date": "2021-11-18 16:18:10 -0500", "subject": "Check archive immediately after backup start.", "body": "Previously the archive was only checked at the end of the backup to ensure all WAL required to make the backup consistent was present. The problem was that if archiving was not functioning then the backup had to complete before the user found out, which could be a while if the database was large enough.\r\n\r\nAdd an archive check immediately after backup start so failures are reported earlier.\r\n\r\nThe trick is to determine which WAL to check. If the repo is new there may not be any WAL in it and pg_start_backup() will not switch the WAL segment if it is empty. 
These are both likely scenarios when setting up and/or testing pgBackRest.\r\n\r\nIf the WAL segment is switched by pg_start_backup(), then check the archive for the segment that was detected prior to backup start. This should be common on normal running clusters with regular activity. Note that this might not be the segment immediately prior to the backup start segment if WAL volume is high.\r\n\r\nIf pg_start_backup() did not switch the WAL then we can force a switch on PostgreSQL >= 9.3 by creating a restore point. In that case the WAL to check will be the backup start WAL. This is most likely to happen on idle systems, during testing, or immediately after a repo switch.\r\n\r\nAn advantage of this approach other than earlier notification is that the backup directory will not be created so no resume will be attempted on the next backup.\r\n\r\nNote that some additional churn was created in backup.c because the load of archive.info needs to be done earlier." }, { "commit": "dea752477ab8e812cdbd717eb2091baf3f5d0906", "date": "2021-11-17 16:39:04 -0500", "subject": "Remove obsolete statement about future multi-repository support." }, { "commit": "0949b4d35fdd04c55927eb6a107d881376dbe73c", "date": "2021-11-16 18:26:21 -0500", "subject": "Add linefeed and remove space." }, { "commit": "809f0bbc638cdd95540e2257383147919f82e8f9", "date": "2021-11-16 11:34:53 -0500", "subject": "Add infoBackupLabelExists().", "body": "This is easier to read than using infoBackupDataByLabel() != NULL.\n\nIt also allows an assertion to be added to infoBackupDataByLabel() to ensure that a NULL return value is not used unsafely." }, { "commit": "1e77fc3d75490b7a1b6a0b31be9298c995ec672f", "date": "2021-11-16 10:21:32 -0500", "subject": "Include backup_label and tablespace_map file sizes in log output.", "body": "In cases where they are returned by postgres, include backup_label and tablespace_map file sizes in the backup size value output in the log." 
}, { "commit": "6b5322cdad7163d91b43d37d9d8eeaa39ac7f214", "date": "2021-11-16 09:27:15 -0500", "subject": "Add findutils package to RHEL 8 documentation container.", "body": "This package was dropped from the most recent Rocky Linux 8 image." }, { "commit": "df89eff429e9b8fbc68d9e9895badf9719fd31d2", "date": "2021-11-15 16:53:41 -0500", "subject": "Fix typos and improve documentation for the tablespace-map-all option." }, { "commit": "fcae9d35038d454c674921c65beb02b195981480", "date": "2021-11-15 16:42:46 -0500", "subject": "Fix parameter test logging in parseOptionIdxValue()." }, { "commit": "b3a5f7a8e27768c445458e47dad626609814fbb7", "date": "2021-11-15 14:32:22 -0500", "subject": "Add tablespace_map file to command/backup test module.", "body": "The code worked fine but better to have explicit tests for this file." }, { "commit": "e62ba8e85eaf469052960c4fd71ffaf26c1a1baa", "date": "2021-11-12 17:15:45 -0500", "subject": "Add path to pgbench used for stress test in user guide.", "body": "This allows the stress test to run on RHEL." }, { "commit": "43cfa9cef776360e592882c0b787704dbeb36cb3", "date": "2021-11-10 12:14:41 -0500", "subject": "Revive archive performance test.", "body": "This test was lost due to a syntax issue in a58635ac.\n\nUpdate the test to use system() to better mimic what postgres does and add logging so pgBackRest timing can be determined." }, { "commit": "dd96c29f963609fad38dac3349d7fa41e40722bb", "date": "2021-11-10 07:53:46 -0500", "subject": "Refactor postgres/client module with inline getters/setters.", "body": "Extend the pattern introduced in 79a2d02 to the postgres/client module." }, { "commit": "afe77e76e0adf948138d797e227a6f4c7d47c2eb", "date": "2021-11-10 07:31:02 -0500", "subject": "Update contributor for 6e635764." 
}, { "commit": "6e635764a66278d5a8c2b4d30b23063bc3923067", "date": "2021-11-09 13:24:56 -0500", "subject": "Match backup log size with size reported by info command.", "body": "Properly log the size of files copied during the backup, matching the backup size returned from the info command.\r\n\r\nIn the reference issue, the incremental backup after switchover logs the size of all files evaluated rather than only the size of the files copied in the backup." }, { "commit": "d05d6b87142347cb4891304833db389dcf7f9a81", "date": "2021-11-08 09:39:58 -0500", "subject": "Do not delete manifests individually during stanza delete.", "body": "This appears to have been an attempt to not delete files that we don't recognize, but it only works in narrow cases and could leave the user is a position of not being able to complete the stanza delete without manual intervention. It seems better just to proceed with the delete, especially since the info files have already been removed.\n\nIn addition, deleting the manifests individually could be slow on object stores if there were a very large number of backups." }, { "commit": "bb03b3f41942d0b781931092a76877ad309001ef", "date": "2021-11-04 09:44:31 -0400", "subject": "Refactor switch statements in strIdBitFromZN().", "body": "Coverity does not like fall-throughs either to or from the default case so refactor to avoid that." }, { "commit": "676b9d95dd2467d4bddd402b5cd2b4f445c71944", "date": "2021-11-04 08:19:18 -0400", "subject": "Optional parameters for tlsClientNew().", "body": "There are a number of optional parameters with the same type so this makes them easier to track and reduces churn when new ones are added." 
}, { "commit": "038abaa71d816cc87b382bd81d3df62ddec9455a", "date": "2021-11-03 15:23:08 -0400", "subject": "Display size option default and allowed values with appropriate units.", "body": "Size option default and allowed values were displayed in bytes, which was confusing for the user.\r\n\r\nThis also lays the groundwork for adding units to time options.\r\n\r\nMove option parsing functions into a common module so they can be used from the build module." }, { "commit": "1b93a772369bbb3a936099e0d9d5cc79bad1e0f6", "date": "2021-11-03 12:14:17 -0400", "subject": "Use void * instead of List * to avoid Coverity false positives.", "body": "Coverity complains that this should be \"List\" but that is clearly not correct." }, { "commit": "2a576477b316238473525e56bc8fc8ea5790455f", "date": "2021-11-03 11:36:34 -0400", "subject": "Add --cmd option.", "body": "Allows users to provide an executable to be used when pgbackrest generates command strings that expect to invoke pgbackrest. These generated commands are written to files by pgbackrest, e.g. recovery.conf." }, { "commit": "c5b5b5880619d0994ab4a8feb3f60ab52170b61b", "date": "2021-11-03 10:36:31 -0400", "subject": "Simplify error handler.", "body": "The error handler used a loop to process try, catch, and finally blocks. This worked fine but static analysis tools like Coverity did not understand that the finally block would always run and so there were false positives about double-free, unfreed resource, etc.\r\n\r\nThis implementation removes the loop, which simplifies everything, and makes it clear that the finally block will always run. This cuts down on Coverity false positives.\r\n\r\nThis implementation also catches lack of coverage on empty catch blocks so a few test fixes were committed separately in d74fe7a.\r\n\r\nA small refactor in backup.c is required because gcc 10.3.1 on Fedora 33 complains that the reason variable may be used uninitialized. 
It's not clear why this is the case, but reducing the scope of the TRY block fixes the issue." }, { "commit": "cff961ede7e41fa8035ffe7451a22eb5ea0e46c1", "date": "2021-11-03 07:38:06 -0400", "subject": "Centralize logic to build value lists during config rendering.", "body": "This reduces duplication and makes it easier to add new types." }, { "commit": "7f6c513be925c77bc6a177408efcf79f624ffc94", "date": "2021-11-03 07:27:26 -0400", "subject": "Add StringId as an option type.", "body": "Rather the converting String to StringIds at runtime, store defaults in StringId format in parse.auto.c and convert user input to StringId during parsing." }, { "commit": "b13844086d419dc3070bcce4e918b2353bf4887c", "date": "2021-11-01 17:35:19 -0400", "subject": "Use cfgOptionStrId() instead of cfgOptionStr() where appropriate.", "body": "The compress-type, repo-type and log-level-* options have allow lists, which means it is more efficient to treat them as StringIds.\r\n\r\nFor compress-type and log-level-* also update the functions that convert them to enums." }, { "commit": "b237d0cd592bbc6c6ee9280fb7aed264bf79eb9d", "date": "2021-11-01 10:43:08 -0400", "subject": "Remove placeholder bz2 helper data.", "body": "This placeholder data should have been removed when bz2 support was added in a021c9fe053." }, { "commit": "f4e281399a81835821547ea5c78ed7a189914d3d", "date": "2021-11-01 10:27:57 -0400", "subject": "Remove unused protocol log level.", "body": "This log level was used in the Perl code but was never ported to C." }, { "commit": "bc352fa6a8cff7cc08b6c7f3cdfac664d2b0805f", "date": "2021-11-01 10:08:56 -0400", "subject": "Simplify strIdFrom*() functions.", "body": "The strIdFrom*() forced the caller to pick an encoding, which led to a number of TRY...CATCH blocks in the code. 
In practice the caller does not care which encoding is used as long as the string is valid for some encoding.\r\n\r\nUpdate the strIdFrom*() function to try all possible encodings and only throw an error when the string is not valid for any of them." }, { "commit": "a92d7938197d1035e362390ce467ae827cbae051", "date": "2021-11-01 09:11:43 -0400", "subject": "Update automake version.", "body": "There were no changes to install.sh in this version." }, { "commit": "904b897f5e89542784af64b364a49205e7a6e040", "date": "2021-11-01 09:03:42 -0400", "subject": "Begin v2.37 development." }, { "commit": "42fd6ce4e09ee92614cfbfb6766d9c3a6ba9cc1a", "date": "2021-11-01 08:59:14 -0400", "subject": "v2.36: Minor Bug Fixes and Improvements" }, { "commit": "6abb06248c2829f2c27a7a553d373b0fdf70cfc3", "date": "2021-10-29 11:45:50 -0400", "subject": "Make analytics optional for HTML documentation.", "body": "Analytics should only be added to the current HTML documentation on the website, so exclude them by default." }, { "commit": "13366573261bf2562bc93ef77014f7d16b815e5b", "date": "2021-10-29 10:35:56 -0400", "subject": "Restore some linefeed rendering behavior from before def7d513.", "body": "The new rendering behavior is correct in normal cases, but for the pre-rendered HTML blocks in the command and configuration references it causes a lot of churn. This would be OK if the new HTML was diff-able, but it is not.\n\nGo back to the old behavior of using br tags for this case to reduce churn until a more permanent solution is found." }, { "commit": "c32e000ab92e9e9e5495ddec7c4e347c35801570", "date": "2021-10-28 15:15:49 -0400", "subject": "Use Rocky Linux for documentation builds instead of CentOS.", "body": "Since CentOS 8 will be EOL at the end of the year it makes sense to do this now. The centos:8 image is still used in documentation.xml because changes there require manual testing, which will need to be done at a later date. 
The changes are not user-facing, however, and can be done at any time.\n\nAlso update CentOS references to RHEL since that is what we are emulating for testing purposes." }, { "commit": "30c589ace7a459f3b3d09b702e314efd412e71d6", "date": "2021-10-28 13:28:49 -0400", "subject": "Fix typo in contributing guide.", "body": "Not sure how this got broken but it was probably an errant search and replace." }, { "commit": "2f1a2877373c7be68d553c7f781299edaf8ff196", "date": "2021-10-28 11:49:00 -0400", "subject": "Add missing assert." }, { "commit": "adc09ffc3bccb24c83a471c8af1f9bf68f2cf9c8", "date": "2021-10-28 08:10:43 -0400", "subject": "Minor fix for lower-casing of option summaries.", "body": "This works with existing cases and fixes \"I/O\"." }, { "commit": "fa564ee1969229b5cf60d2479d8ace85325f4db3", "date": "2021-10-27 11:08:32 -0400", "subject": "Improve documentation for cmd-ssh, repo-host-cmd, pg-host-cmd options.", "body": "Use \"command\" instead of \"exe\" and make the descriptions more consistent." }, { "commit": "e1f6c066b3da11fd21b1155c90370c3fa2da06b7", "date": "2021-10-27 10:52:39 -0400", "subject": "Improve documentation for buffer-size option." }, { "commit": "1f7c7b7dda1c736fab2673084498fc7c220b742a", "date": "2021-10-26 16:56:44 -0400", "subject": "Fix test descriptions in common/typeVariantTest." }, { "commit": "d74fe7a222c1e1ae0f02addbeb712f8946d3d731", "date": "2021-10-26 13:53:44 -0400", "subject": "Add coverage for empty CATCH() blocks.", "body": "Currently empty CATCH() blocks are always marked as covered because of the loop structure of error handling.\n\nA prototype implementation of error handling without looping has shown that these CATCH() blocks are not covered without new tests. Whether or not that prototype gets committed it is worth adding the tests." 
}, { "commit": "e2eea974c144f77448aa9d5fbb55c933b70ea5ad", "date": "2021-10-26 12:09:41 -0400", "subject": "Add assertion for Coverity.", "body": "Coverity thinks this value might be NULL but that should not be possible because of the TRY...CATCH block." }, { "commit": "4f10441574761c9cd4e31cdef750742e004ae669", "date": "2021-10-26 08:25:21 -0400", "subject": "Add missing paragraph tags in coding standards." }, { "commit": "7fb99c59c88fe11c679d6ba7835f995a969462c0", "date": "2021-10-26 07:46:48 -0400", "subject": "Use externed instead of extern'd in comments.", "body": "This is mostly to revert some comment changes in b11ab9f7 that will break the ppc64le patch, but at the same time keep the spelling consistent in all comments and documentation.\n\nAlso revert some space changes for the same reason." }, { "commit": "653ffcf8d98ebfe94ae44ed54b4a295428c57850", "date": "2021-10-25 15:42:28 -0400", "subject": "Adjustments for new breaking change in Azurite.", "body": "Azurite released another breaking change (see fbd018cd, 096829b3, c38d6926, and Azurite issue 1039) so make adjustments as needed to documentation and tests.\n\nAlso remove some dead code that hid the repo-storage-host option and was made obsolete by all these changes." }, { "commit": "13d4559708819787ad05be6f37ec0badb0eccae5", "date": "2021-10-25 15:31:39 -0400", "subject": "Check return value of getsockopt().", "body": "Checking the return value is not terribly important here, but if setsockopt() fails it is likely that bind() will fail as well. May as well get it over with and this makes Coverity happy." }, { "commit": "1152f7a7d64e69eed1d9e74b48a308f6c742c28a", "date": "2021-10-25 12:56:33 -0400", "subject": "Fix mismatched parameters in tlsClientNew() call.", "body": "3879bc69 added this call and the parameters were not quite right but in way that the compiler decided they were OK. It was mostly working but TLS verification was disabled if caPath was NULL, which is not OK." 
}, { "commit": "a1a2284c881ba6c3b9b1c316b31e0583c006f1af", "date": "2021-10-25 09:01:22 -0400", "subject": "Fix typos in error messages." }, { "commit": "3879bc69b888daa04d2ca98a2d1219cf22519ddc", "date": "2021-10-22 18:31:55 -0400", "subject": "Add WebIdentity authentication for AWS S3.", "body": "This allows credentials to be automatically acquired in an EKS environment." }, { "commit": "51785739f44b624091246c48af6defe97c30d7a7", "date": "2021-10-22 18:02:20 -0400", "subject": "Store config values as a union instead of a variant.", "body": "The variants were needed to easily serialize configurations for the Perl code.\r\n\r\nUnions are more efficient and will allow us to add new types that are not supported by variants, e.g. StringId." }, { "commit": "2cea005f740d640290a9948595f5933833e30e7d", "date": "2021-10-22 17:19:16 -0400", "subject": "Fix segfault on invalid GCS key file." }, { "commit": "cb36fec102855bf268ec5234bbb5261be98bdc61", "date": "2021-10-21 17:48:00 -0400", "subject": "Add analytics to the HTML documentation." }, { "commit": "a63e732987bc1f6f26514568dabc6c0b23df07ab", "date": "2021-10-21 17:25:32 -0400", "subject": "Fix indentation." }, { "commit": "78e1bd333068c4a857054490698115fa2c698e0b", "date": "2021-10-21 17:10:00 -0400", "subject": "Move v1 documentation links out of the introduction.", "body": "There should be few if any users running v1 now so these links do not need to be so prominent." }, { "commit": "861df2a73cafbd49049dccdc55d5214b00dd3cec", "date": "2021-10-21 17:02:46 -0400", "subject": "Add GitHub repository link to index.html and README.md." }, { "commit": "1cb8ae15de5b4276682bdd9825ca97012cd43855", "date": "2021-10-21 13:51:59 -0400", "subject": "Fix incorrect host name in user guide.", "body": "The text indicates to populate the pg-primary IP address into the pg_hba.conf file to allow replication connections. 
It should indicate to populate the pg-standby IP address." }, { "commit": "b11ab9f799aa6fc32dd03e96e8a0428d5c83d9ae", "date": "2021-10-21 13:31:22 -0400", "subject": "Fix typos." }, { "commit": "8ad6b7330e1ee6bcbc0f06ec0562a433e7888f44", "date": "2021-10-21 09:20:40 -0400", "subject": "Fix outdated comment.", "body": "This check was moved from within the path checks at some point but the comment did not get updated." }, { "commit": "fbd018cd56482efff425beb4026fe22482115138", "date": "2021-10-20 08:22:37 -0400", "subject": "Allow S3/Azure Docker images to be specified in user guide.", "body": "It is not uncommon for the S3/Azure emulators we use to introduce breaking changes without warning. If that happens the documentation can still be built by specifying a working version of the image. In general, it is better to let the version float so we know when things break.\n\nAzurite has yet another breaking change coming up (see 096829b3, c38d6926, and Azurite issue 1039) so set azure-image at the current version until the breaking change has been released." }, { "commit": "5dfdd6dd5b7b43dc3a223b9552aed4052d0db3aa", "date": "2021-10-19 12:45:20 -0400", "subject": "Add -Werror -Wfatal-errors -g flags to configure --enable-test.", "body": "These flags are used for all tests but it was not possible to add them to configure before the change in 046d6643. This is especially important for adhoc tests to ensure the flags are not forgotten.\n\nRemove the flags from test make commands where they were being applied.\n\nThere is no change for production builds." }, { "commit": "046d6643373859c5e848a97e06389ed2aa553723", "date": "2021-10-19 12:14:09 -0400", "subject": "Set most compiler flags as late as possible in configure.", "body": "Some flags, e.g. -Wfatal-errors, will cause tests in configure to behave incorrectly so we have not been able to add them to --enable-test.\n\nAdd the compiler flags as late as possible so configure checks are not affected.
This will allow us to add flags that we need for testing without having to explicitly pass them to make." }, { "commit": "e443e3c6c05c9d65a67dac0c8430b59239fbc1b8", "date": "2021-10-19 09:06:06 -0400", "subject": "Add br tags for HTML documentation rendering missed in def7d513." }, { "commit": "4c2d89eb66e11017e2e73ad4171e4493c28acdad", "date": "2021-10-18 16:43:19 -0400", "subject": "Fix typos." }, { "commit": "6cc8e45df68c299990c0ad1f40c53b9282cb46db", "date": "2021-10-18 14:45:36 -0400", "subject": "Add missing paragraph tag in user guide." }, { "commit": "ccc255d3e05d8ce2b6ac251d1498f71b04098a86", "date": "2021-10-18 14:32:41 -0400", "subject": "Add TLS Server.", "body": "The TLS server is an alternative to using SSH for protocol connections to remote hosts.\n\nThis command is currently experimental and intended only for trial and testing. As such, the new commands and options will not show up in the command-line help unless directly requested." }, { "commit": "09fb9393f14b47effebaecc449a97ad07ef4c752", "date": "2021-10-18 14:02:05 -0400", "subject": "Write command configuration overrides explicitly.", "body": "If not written explicitly then it is impossible to distinguish the override from a NULL, which indicates no override." }, { "commit": "90f7f11a9f71152185219bbb57bf1de001e3a91b", "date": "2021-10-18 12:22:48 -0400", "subject": "Add missing static keywords in test modules." }, { "commit": "4570c7e27528400373ece8dc7bd348baf3ff064e", "date": "2021-10-18 11:32:53 -0400", "subject": "Allow error buffer to be resized for testing.", "body": "Some tests can generate very large error messages for diffs and they often get cut off before the end.\n\nAlso fix a test so it does not create too large a buffer on the stack." 
}, { "commit": "838ee3bd08c739e3dcf611e9bddfaa6c8acbb2aa", "date": "2021-10-18 11:05:53 -0400", "subject": "Increase some storage test timeouts.", "body": "32-bit Debian 9 is sometimes timing out on these tests so increase the timeouts to make the tests more reliable." }, { "commit": "6b9e19d423d99d3063c4bff3d3533b1e5081e4cb", "date": "2021-10-16 12:35:47 -0400", "subject": "Convert configuration optional rules to pack format.", "body": "The previous format was custom for configuration parsing and was not as expressive as the pack format. An immediate benefit is that commands with the same optional rules are merged.\n\nDefaults are now represented correctly (not multiplied), which simplifies the option default functions used by help." }, { "commit": "360cff94e4e9e1ab5a690a1f5c38eb278158a892", "date": "2021-10-16 12:33:31 -0400", "subject": "Update 32-bit test container to Debian 9.", "body": "Also rebalance PostgreSQL version integration tests." }, { "commit": "0e84c19a9fde0480b30078f5d3b419267b2f7673", "date": "2021-10-15 17:50:54 -0400", "subject": "Remove allow range from pg-socket-path option.", "body": "The allow range was never processed because the string type does not allow ranges, but it is wasteful to have it in the parse rules.\n\nIt would be good if auto-generation errored on useless allow ranges, but that will need wait since it does not impact production." }, { "commit": "144469b9772bad14466fcafc65edef58c5366755", "date": "2021-10-15 15:50:55 -0400", "subject": "Add const buffer functions to Pack type.", "body": "These allow packs to be created without allocating a buffer in the case that the buffer already exists or the data is in a global constant.\n\nAlso fix a rendering issue in hrnPackReadToStr()." }, { "commit": "66bfd1327e56f0f2de99fc6009431f3ee06ad6b8", "date": "2021-10-13 19:48:41 -0400", "subject": "Rename SSH connection control parameters in integration tests." 
}, { "commit": "447b24309d02938d04e036ec7814e75982210eb4", "date": "2021-10-13 19:43:40 -0400", "subject": "Update RHEL package URL." }, { "commit": "01b20724daf4c5cb25d6f636fb90456759773d22", "date": "2021-10-13 19:36:59 -0400", "subject": "Rename PostgreSQL pid file constants and tests." }, { "commit": "570162040864b8c56b236aa66d8f5c8d610b754b", "date": "2021-10-13 19:02:58 -0400", "subject": "Rename manifest file primary flag in tests." }, { "commit": "a44f9e373b47354a09bec0eaf2f3bf9e261c6941", "date": "2021-10-13 13:21:04 -0400", "subject": "Update Vagrantfile to Ubuntu 20.04." }, { "commit": "b16e827d69364408ea687a8e4b8894f7e889792e", "date": "2021-10-13 13:20:11 -0400", "subject": "Do not show output of PostgreSQL upgrade status check in user guide.", "body": "On some platforms the output may contain UTF-8 characters that the latex code is not prepared to handle.\n\nShowing the command is much more important than showing the output, so no big loss." }, { "commit": "5e84645ac030544b572036c08d055885d96d8905", "date": "2021-10-13 12:16:47 -0400", "subject": "Update comments referring to the PostgreSQL primary." }, { "commit": "430efff98a5b8dcf7c048f383abc12d9c0e5bbf0", "date": "2021-10-13 12:01:53 -0400", "subject": "Update documentation/links to main branch." }, { "commit": "1212668d5eff51756b0719b5296f7640e8096605", "date": "2021-10-13 11:43:14 -0400", "subject": "Update contributing.xml with rendering changes from def7d513.", "body": "Also update help.xml path missed in f4e1babf." }, { "commit": "90c73183ea5de6a63a23a9047ae2debb3f59b940", "date": "2021-10-13 09:37:03 -0400", "subject": "Add libc6-dbg required by updated valgrind to Vagrantfile/Dockerfile." }, { "commit": "c2d4552b7328489d703dc03defa769b1ccb8f739", "date": "2021-10-13 08:51:58 -0400", "subject": "Add debug options to code generation make in test.pl." 
}, { "commit": "bd91ebca759d2d6cfc2f7aa660366f5f7f09994a", "date": "2021-10-12 16:16:05 -0400", "subject": "Remove command overrides for output options.", "body": "The overrides are not needed since both commands require the same default and allow list." }, { "commit": "e8e346bc8738815b07bd80a37ace862fdab3dc1d", "date": "2021-10-12 08:53:12 -0400", "subject": "Remove command overrides for restore-only options.", "body": "The overrides are not needed since these options are only valid for one command." }, { "commit": "576b04763477877d4f2a61ad692703e20471901b", "date": "2021-10-11 16:25:36 -0400", "subject": "Invert required in set option to simplify generated rules." }, { "commit": "980b777a4a0cc07200ccb06cc55b89181101b266", "date": "2021-10-09 12:39:54 -0400", "subject": "Fix indentation." }, { "commit": "cc7f2eea900d0a7a429ffdc2cd45ead88c0298a9", "date": "2021-10-09 12:37:25 -0400", "subject": "Add assert in pckReadNext() to prevent reading into a field." }, { "commit": "610bfd736ef091a8298e5602d41aba86f10189bb", "date": "2021-10-09 12:34:45 -0400", "subject": "Increase tolerance for 0ms sleep in common/time test." }, { "commit": "7ab8dcbe6e007831e8a7f9f26e08f083c1026388", "date": "2021-10-09 12:15:19 -0400", "subject": "Read tag size in pckReadTagNext().", "body": "Rather than reading the size everywhere it is needed, get it when the tag is read, if it exists.\n\nThis simplifies logic around consuming the data when not needed. There are more use cases for this coming up." }, { "commit": "ed68792e765411a994d8ac79e4d047bbafc25582", "date": "2021-10-07 19:57:28 -0400", "subject": "Rename strNewN() to strNewZN().", "body": "Make the function name consistent with other functions that accept zero-terminated strings, e.g. strNewZ() and strCatZN()." 
}, { "commit": "b7e17d80ea02d70e56327828310c578af51795b5", "date": "2021-10-07 19:43:28 -0400", "subject": "More efficient memory allocation for Strings and String Variants.", "body": "The vast majority of Strings are never modified so for most cases allocate memory for the string with the object. This results in one allocation in most cases instead of two. Use strNew() if strCat*() functions are needed.\n\nUpdate varNewStr() in the same way since String Variants can never be modified. This results in one allocation in all cases instead of three. Also update varNewStrZ() to use STR() instead of strNewZ() to save two more allocations." }, { "commit": "208641ac7fd22f676a55e7305b3e69df574f36f8", "date": "2021-10-07 18:50:56 -0400", "subject": "Use constant string for user/group in performance/type test.", "body": "It is not safe to return strings created with STRDEF() from a function." }, { "commit": "74d3131830646c9f71ffa9847729cae40e3aa866", "date": "2021-10-07 14:58:11 -0400", "subject": "More efficient generation of diff/incr backup label." }, { "commit": "498902e885c9d7a44648ccc01b20454094d5b742", "date": "2021-10-07 12:18:24 -0400", "subject": "Allow \"global\" as a stanza prefix.", "body": "A stanza name like global_stanza was not allowed because the code was not selective enough about how a global section should be formatted.\r\n\r\nUpdate the config parser to correctly recognize global sections." }, { "commit": "338102861fd0ea4d2773b010dee34a39a96ad702", "date": "2021-10-07 11:01:48 -0400", "subject": "Improve instructions for rebuilding pgbackrest during stress testing." 
}, { "commit": "fb3f6928c9aef499938e195fbb612c1940a2dc19", "date": "2021-10-06 19:27:04 -0400", "subject": "Add configurable storage helpers to create repository storage.", "body": "Remove the hardcoded storage helpers from storageRepoGet() except for the the built-in Posix helper and the special remote helper.\n\nThe goal is to make storage driver development a bit easier by isolating as much of the code as possible into the driver module. This also makes coverage reporting much simpler for additional drivers since they do not need to provide coverage for storage/helper.\n\nConsolidate the CIFS tests into the Posix tests since CIFS is just a special case of the Posix.\n\nTest all storage features in the Posix test so that other storage driver tests do not need to provide coverage for storage/storage.\n\nRemove some dead code in the storage/s3 test." }, { "commit": "cfd823355af2ac99f30c5e1393a121c6dbf622b7", "date": "2021-10-06 12:38:56 -0400", "subject": "Refactor S3 storage driver for additional auth methods.", "body": "Currently only two auth methods are supported and a lot of refactoring is required to add a third one.\n\nDo the refactoring now to reduce noise in the commit that adds the third auth method." }, { "commit": "68c5f3eaf18fc9bd10fde15a323890bcdcbf4534", "date": "2021-10-05 17:59:05 -0400", "subject": "Allow link-map option to create new links.", "body": "Currently link-map only allows links that exist in the backup manifest to be remapped to a new destination.\r\n\r\nAllow link-map to create a new link as long as a valid path/file from the backup is referenced." }, { "commit": "f2aeb30fc706c04d6200cefeeb2645229a31ff69", "date": "2021-10-05 14:06:59 -0400", "subject": "Add state to ProtocolClient.", "body": "This is currently only useful for debugging, but in the future the state may be used for resetting the protocol when something goes wrong." 
}, { "commit": "2c65fed80f47124283d4f8be92f987dc55237f48", "date": "2021-10-05 12:29:16 -0400", "subject": "Add missing asserts and move temp mem context block." }, { "commit": "6af827cbb1e78cd4c5d649ed4cb24c49a7204b8f", "date": "2021-10-05 09:00:16 -0400", "subject": "Report original error and retries on local job failure.", "body": "The local process will retry jobs (e.g. backup file) but after a certain number of failures gives up. Previously, the last error was reported but generally the first error is far more valuable. The last error is likely to be a cascade failure such as the protocol being out of sync.\r\n\r\nReport the first error (and stack trace) and append the retry errors to the first error without stack trace information." }, { "commit": "34f78734325743b4e34cb39224852e6debf49750", "date": "2021-10-04 13:45:53 -0400", "subject": "Report backup file validation errors in backup.info.", "body": "Currently errors found during the backup are only available in text output when specifying --set.\r\n\r\nAdd a flag to backup.info that is available in both the text and json output when --set is not specified. This at least provides the basic info that an error was found in the cluster during the backup, though details are still only available as described above." }, { "commit": "57c62315465972f6b85558020198134e34cf2ee0", "date": "2021-10-02 17:27:33 -0400", "subject": "Add arm64 testing on Cirrus CI.", "body": "These tests run in a container without permissions to mount tempfs, so add an option to ci.pl to not create tempfs. Also add some packages not in the base image." }, { "commit": "f1ed8f0e5112d1a74d86168e67632be55eddb416", "date": "2021-10-02 16:29:31 -0400", "subject": "Sort WAL segment names when reporting duplicates.", "body": "Make the output consistent even when files are listed in a different order. This is purely for testing purposes, but there is no harm in consistent output.\n\nFound on arm64." 
}, { "commit": "71047a9d6d1eea71b3fbd430983541a54049cc69", "date": "2021-10-02 16:17:33 -0400", "subject": "Use strncpy() to limit characters copied to optionName.", "body": "Valgrind complained about uninitialized values on arm64 when comparing the reset prefix, probably because \"reset\" ended up being larger than the option name: Conditional jump or move depends on uninitialised value(s) at cfgParseOption (parse.c:568).\n\nCoverity complained because it could not verify the size of the string to be copied into optionName, probably because it does not understand the purpose of strSize(): You might overrun the 65-character fixed-size string \"optionName\" by copying the return value of \"strZ\" without checking the length.\n\nUse strncpy() even though we have already checked the size and make sure the string is terminated. Keep the size check because searching for truncated option names is not a good idea.\n\nThis is not a production bug since the code has not been released yet." }, { "commit": "b792a14cd7dbdfb61362700ffc5fc01997db890c", "date": "2021-10-01 18:23:03 -0400", "subject": "Use temp mem context when calling command handlers.", "body": "It is safer and more efficient to free memory after each handler completes.\n\nThe db command handlers use the server context so update them to use the top context." }, { "commit": "ae40ed6ec9cf77e577518b128518a5763767f589", "date": "2021-10-01 17:15:36 -0400", "subject": "Add jobRetry parameter to HRN_CFG_LOAD().", "body": "Allow the default of 0 to be overridden to test retry behavior for commands." }, { "commit": "136d309dd4bc1ada9f3d775f036b62292fda390b", "date": "2021-10-01 15:29:31 -0400", "subject": "Allow stack trace to be specified for errorInternalThrow().", "body": "This allows the stack trace to be set when an error is received by the protocol, rather than appending it to the message. 
Now these errors will look no different than any other error and the stack trace will be reported in the same way.\n\nOne immediate benefit is that test.pl --vm-out --log-level-test=debug will work for tests that check expect log results. Previously, the test would error at the first check because the stack trace included in the message would not match the expected log output." }, { "commit": "62f6fbe2a9ecd9bc48611ecfaf27a05b9f36a87d", "date": "2021-10-01 10:15:34 -0400", "subject": "Update file mode in info/manifest test to 0600.", "body": "0400 is not a very realistic mode. It may have become the default due to copy-pasting." }, { "commit": "0690cb25a077735780b2fe24343c946f8a4efbc6", "date": "2021-09-30 17:55:38 -0400", "subject": "Remove repository format 6 notes.", "body": "The notes have been moved to a Github project." }, { "commit": "376362475e3b3b13b70313d9002d4ff8b25b4b40", "date": "2021-09-30 16:15:45 -0400", "subject": "Move archive-header-check option to the archive reference section." }, { "commit": "cf1a57518fe3230886509f59fcec9c9a81e6513c", "date": "2021-09-30 14:29:49 -0400", "subject": "Refactor restoreManifestMap() to be driven by link-map.", "body": "This will allow new links to be added in a future commit. The current implementation is driven by the links that already exist in the manifest, which would make the new use case more complex to implement.\n\nAlso, add a more helpful error when a tablespace link is specified." }, { "commit": "d89a67776cfbb7b3047dbe297cc0c768e5c670e8", "date": "2021-09-30 13:39:29 -0400", "subject": "Refactor restoreManifestMap() tests in the command/restore unit.", "body": "Add test titles, new tests, and rearrange.\n\nAlso manifestTargetFindDefault(), which will soon be used by core code in a refactoring commit." }, { "commit": "7a53ba7c7f38bff2c9ef99c6ab58d22c59dd2290", "date": "2021-09-30 13:28:14 -0400", "subject": "Add note to comment for int64 typedef." 
}, { "commit": "815377cc6009c800b6a5fdc1fe98ddfceaaae824", "date": "2021-09-30 13:27:14 -0400", "subject": "Finalize catalog number for PostgreSQL 14 release." }, { "commit": "baf186bfb05ca683714a59485900892f5e7e8a1b", "date": "2021-09-29 12:03:01 -0400", "subject": "Fix comment typos." }, { "commit": "9e79f0e64b661e944ce2b3897c366feea1544ac2", "date": "2021-09-29 10:31:51 -0400", "subject": "Add recovery start time to online backup restore log.", "body": "This helps give an idea of how much recovery needs to be done to reach the end of the WAL stream and is easier to read than the backup label." }, { "commit": "9346895f5b61627b50d431f09447e08d8a50caa8", "date": "2021-09-29 09:58:47 -0400", "subject": "Rename page checksum error to error list in info text output.", "body": "\"error list\" makes it clearer that other errors may be reported. For example, if checksum-page is true in the manifest but no checksum-page-error list is provided then the error is in alignment, i.e. the file size is not a multiple of the page size, with allowances made for a valid-looking partial page at the end of the file.\r\n\r\nIt is still not possible to differentiate between alignment and page checksum errors in the output but this will be addressed in a future commit." }, { "commit": "b7ef12a76f219881d0b24c715592af96fe5c9b8f", "date": "2021-09-28 15:55:13 -0400", "subject": "Add hints to standby replay timeout message." }, { "commit": "096829b3b257444162417662612e80d8cb2ac6ec", "date": "2021-09-27 09:01:53 -0400", "subject": "Add repo-azure-uri-style option.", "body": "Azurite introduced a breaking change in 8f63964e to use automatically host-style URIs when the endpoint appears to be a multipart hostname.\n\nThis option allows the user to configure which style URI will be used, but changing the endpoint might cause breakage if Azurite decides to use a different style. Future changes to Azurite may also cause breakage." 
}, { "commit": "c8ea17c68f8fa72f2bf3b979539be3f709448493", "date": "2021-09-24 17:40:31 -0400", "subject": "Convert page checksum filter result to a pack.", "body": "The pack is both more compact and more efficient than a variant.\n\nAlso aggregate the page error info in the main process rather than in the filter to allow additional LSN filtering, to be added in a future commit." }, { "commit": "ac1f6db4a25520b1ee957b66925dde2e1ef156ab", "date": "2021-09-23 14:06:00 -0400", "subject": "Centralize and optimize tag stack management.", "body": "The push and pop code was duplicated in four places, so centralize the code into pckTagStackPop() and pckTagStackPush().\n\nAlso create a default bottom item for the stack to avoid allocating a list if there will only ever be the default container, which is very common. This avoids the extra time and memory to allocate a list." }, { "commit": "15e7ff10d3d6fe3570335a5abec5ff683c07e2e6", "date": "2021-09-23 08:31:32 -0400", "subject": "Add Pack pseudo-type.", "body": "Rather than working directly with Buffer types, define a new Pack pseudo-type that represents a Buffer containing a pack. This makes it clearer that a pack is being stored and allows stronger typing." }, { "commit": "131ac0ab5e98500569d4ff6985d31c55a9ef53b9", "date": "2021-09-22 11:18:12 -0400", "subject": "Rename pckReadNew()/pckWriteNew() to pckReadNewIo()/pckWriteNewIo().", "body": "These names more accurately describe the purpose of the constructors." }, { "commit": "0e76ccb5b7b2089f8d0300ab0086c454aebbbbbf", "date": "2021-09-22 10:48:21 -0400", "subject": "Convert filter param/result to Pack type.", "body": "The Pack type is more compact and flexible than the Variant type. The Pack type also allows binary data to be stored, which is useful for transferring the passphrase in the CipherBlock filter.\n\nThe primary purpose is to allow more (and more complex) result data to be returned efficiently from the PageChecksum filter. 
For now the PageChecksum filter still returns the original Variant. Converting the result data will be the subject of a future commit.\n\nAlso convert filter types to StringId." }, { "commit": "802373cb9df28384529fe5a7bd102bfe5c8f3911", "date": "2021-09-21 10:16:16 -0400", "subject": "Limit valgrind error output to the first error.", "body": "Generally the first error is the only important error. The rest simply lead to a lot of scrolling." }, { "commit": "473afce57bc7646c53bae4a6300b41b11b5b0357", "date": "2021-09-20 11:03:50 -0400", "subject": "Copy data page before verifying checksum.", "body": "Using UNCONSTIFY() is potentially dangerous since the buffer is modified while calculating the checksum, even though the page is reverted to the original state. Instead make a copy to ensure that the original data is never modified.\n\nThis requires the logic to be shuffled a bit since the copy cannot be made until we are sure the page is complete." }, { "commit": "0efb8adb9452cb8bc67ebaa8b6dca0d4a69c1682", "date": "2021-09-19 20:38:51 -0400", "subject": "Automatically include all PostgreSQL version interface files." }, { "commit": "95d814cf81b84d162f40717346c3ad0cb642f724", "date": "2021-09-19 20:32:27 -0400", "subject": "Specify size for helpData array." }, { "commit": "912a498b0bcc988bd5ebaaa22198956d752251bd", "date": "2021-09-11 16:07:59 -0400", "subject": "Skip comments when rendering help output.", "body": "Comments should not appear in the help. They are simply notes on implementation." }, { "commit": "c38d6926d6c9aa01b895a28e66fc0aa6965350a3", "date": "2021-09-09 08:48:45 -0400", "subject": "Revert Azurite version for testing to 3.14.0.", "body": "3.14.2 is causing breakage in the documentation. There is no obvious cause so for now just revert to the last working version." 
}, { "commit": "f4e1babf6b4ce7087ace8221cac7cadb51488f0e", "date": "2021-09-08 18:16:06 -0400", "subject": "Migrate command-line help generation to C.", "body": "Command-line help is now generated at build time so it does not need to be committed. This reduces churn on commits that add configuration and/or update the help.\n\nSince churn is no longer an issue, help.auto.c is bzip2 compressed to save space in the binary.\n\nThe Perl config parser (Data.pm) has been moved to doc/lib since the Perl build path is no longer required.\n\nLikewise doc/xml/reference.xml has been moved to src/build/help/help.xml since it is required at build time." }, { "commit": "def7d513cdd2d4579acf6e8c675a3d6f7da4f655", "date": "2021-09-08 17:35:45 -0400", "subject": "Eliminate linefeed formatting from documentation.", "body": "Linefeeds were originally used in the place of

    tags to denote a paragraph. While much of the linefeed usage has been replaced over time, there were many places where it was still being used, especially in reference.xml. This made it difficult to get consistent formatting across different output types. In particular there were formatting issues in the command-line help because it is harder to audit than HTML or PDF.\n\nReplace linefeed formatting with proper

    tags to make formatting more consistent.\n\nRemove double spaces in all text where

    tags were added since it does not add churn.\n\nUpdate all