Executing the configurator script

Execute the Big Data Protector configurator script to:

  1. Download the certificates from ESA.
  2. Create the installation files for the Big Data Protector.

To execute the configurator script:

  1. Log in to the staging machine that has connectivity to ESA.
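If the configurator script was copied to the staging machine without execute permission, mark it executable before running it. The sketch below is illustrative only: it creates a placeholder script so the `chmod`/run sequence is self-contained, and the `1.0.0` version in the filename is a hypothetical stand-in for your actual BDP version.

```shell
# Hypothetical filename; substitute your actual BDP version.
script="BDPConfigurator_CDP-AWS-DataHub-7.3_1.0.0.sh"

# Placeholder body so this sketch runs standalone; on the staging
# machine you would already have the real configurator script.
printf '#!/bin/sh\necho "configurator placeholder"\n' > "$script"

chmod +x "$script"    # grant execute permission
./"$script"           # run it; the real script starts the wizard
```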

  2. Run the following command:

    ./BDPConfigurator_CDP-AWS-DataHub-7.3_<BDP_Version>.sh
    
  3. Press ENTER.

    The prompt to continue the configuration of Big Data Protector appears.

    *******************************************************************************
             Welcome to the Big Data Protector Configurator Wizard
    *******************************************************************************
    This will setup the Big Data Protector Installation Files for CDP AWS Data Hub.
    
    Do you want to continue? [yes or no]:
    
  4. To continue, type yes.

  5. Press ENTER. The prompt to select the type of installation files appears.

    Big Data Protector Configurator started...
    Unpacking...
    Extracting files...
    
    
    Select the type of Installation files you want to generate.
    [ 1: Create All ]      : Creates entire Big Data Protector CSDs, Parcels, Recipes and other files.
    [ 2: Update PTY_CERT ] : Creates new PTY_CERT parcel with an incremented patch version.
                            Use this if you have updated the ESA certificates.
    [ 3: Update PTY_LOGFORWARDER_CONF ]
                        : Creates new PTY_LOGFORWARDER_CONF parcel with an incremented patch version.
                            Use this if you want to set Custom LogForwarder configuration files to
                            forward logs to an External Audit Store.
    
    [ 1, 2 or 3 ]:
    

    Note: From v10.0.0, the PTY_FLUENTBIT_CONF parcel is renamed to PTY_LOGFORWARDER_CONF.

  6. To create all the Big Data Protector parcels and CSDs, type 1.

  7. Alternatively, to update the PTY_CERT parcel with an incremented patch version, type 2.

  8. Alternatively, to update the PTY_LOGFORWARDER_CONF parcel with an incremented patch version, type 3.

  9. Press ENTER. The prompt to select the operating system for the Cloudera Manager parcel appears.

    Select the OS version for Cloudera Manager Parcel.
    This will be used as the OS Distro suffix in the Parcel name.
    
    [ 1: el7 ]    :  RHEL 7 and clones (CentOS, Scientific Linux, etc)
    [ 2: el8 ]    :  RHEL 8 and clones (CentOS, Scientific Linux, etc)
    [ 3: el9 ]    :  RHEL 9 and clones (CentOS, Scientific Linux, etc)
    [ 4: sles12 ] :  SuSE Linux Enterprise Server 12.x
    [ 5: sles15 ] :  SuSE Linux Enterprise Server 15.x
    
    Enter the no.:
    
  10. Depending on your requirements, type 1, 2, 3, 4, or 5 to select the operating system version for the Big Data Protector parcels.

  11. Press ENTER. The prompt to enter the S3 URI to upload the installation file appears.

    Enter the S3 URI where the BDP Installation files are to be uploaded.
    
    (E.g. s3://examplebucket/folder):
    
  12. Enter the S3 URI to which the installation files are to be uploaded.
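Before pasting the value into the prompt, it can help to confirm that it has the expected `s3://<bucket>/<folder>` shape. This check is illustrative and not part of the configurator; the URI below is the example value from the prompt.

```shell
# Illustrative sanity check for the S3 URI you are about to enter.
uri="s3://examplebucket/folder"

case "$uri" in
  s3://*/*) result="ok"  ;;   # bucket plus at least one path segment
  *)        result="bad" ;;   # missing scheme, bucket, or folder
esac

echo "$uri -> $result"
```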

  13. Press ENTER. The prompt to select the upload method appears.

    Choose one option among the following for BDP Installation files:
    [ 1 ] : Upload files to 's3://<bucket_name>/<directory_name>/' S3 URI.
    [ 2 ] : Generate files locally to current working directory. (You would have to manually upload the files to the specified S3 URI)
    
    [ 1 or 2 ]:
    
  14. To upload the files, type 1.

  15. Press ENTER. The prompt to select the authentication option appears.

    Choose the Type of AWS Access Keys from the following options:
    [ 1 ] : IAM User Access Keys (Permanent access key id & secret access key)
    [ 2 ] : Temporary Security Credentials (Temporary access key id, secret access key & session token)
    
    [ 1 or 2 ]:
    
  16. Depending on the authentication option, the script prompts for the following inputs:

    Option  Description
    ------  -----------
    1       Prompts to enter the following permanent IAM user access keys:
            - AWS_ACCESS_KEY_ID
            - AWS_SECRET_ACCESS_KEY
    2       Prompts to enter the following temporary security credentials:
            - AWS_ACCESS_KEY_ID
            - AWS_SECRET_ACCESS_KEY
            - AWS_SESSION_TOKEN
  17. Enter the required credentials.
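The three prompted values correspond to the standard AWS credential variable names. The sketch below only illustrates the naming; the values are the dummy examples from the AWS documentation, and real keys come from your IAM user or STS session. AWS_SESSION_TOKEN applies only to option 2 (temporary security credentials).

```shell
# Dummy example values (from AWS documentation); never commit real keys.
AWS_ACCESS_KEY_ID="AKIAIOSFODNN7EXAMPLE"
AWS_SECRET_ACCESS_KEY="wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY"
AWS_SESSION_TOKEN="IQoJb3JpZ2luX2VjEXAMPLETOKEN"   # option 2 only

# ${var:+yes} expands to "yes" only when the variable is set and non-empty.
echo "credentials staged: ${AWS_ACCESS_KEY_ID:+yes}"
```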

  18. Press ENTER. The prompt to enter the ESA hostname or IP address appears.

    Enter ESA Hostname or IP Address:
    
  19. Enter the ESA hostname or IP address.

  20. Press ENTER. The prompt to enter the ESA listening port appears.

    Enter ESA host listening port [8443]:
    
  21. Enter the listening port number.

  22. Press ENTER.

    The prompt to enter the ESA JSON Web Token (JWT) appears.

    If you have an existing ESA JSON Web Token (JWT) with Export Certificates role, enter it otherwise enter 'no':
    

    Note: The script reads this input silently, so the JWT or the text no that you type is not displayed on the screen.
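A silent prompt like this typically suppresses terminal echo while still capturing the typed value; in bash this is what `read -s` does, which is presumably the mechanism the script uses (an assumption, not confirmed by the transcript). The sketch pipes the input in so it runs non-interactively:

```shell
# Demonstrate a silent (no-echo) read: the typed value is captured but
# never shown. Input is piped in here purely for illustration.
token=$(printf 'no\n' | bash -c 'read -rs t; echo "$t"')
echo "captured: $token"
```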

  23. Enter the JWT. If you do not have an existing ESA JSON Web Token (JWT), complete the following substeps:

    a. Type no.

    b. Press ENTER.
    The prompt to enter the user name with Export Certificates permission appears.

    JWT was not provided. Script will now prompt for ESA username and password.
    Enter ESA Username with Export Certificates role:
    

    c. Enter the username that has permissions to export the certificates.

    d. Press ENTER.
    The prompt to enter the password appears.

      Enter the password for username <user_name>:

    e. Enter the password.

    f. Press ENTER.
    The script retrieves the JWT from ESA, validates it, and the prompt to package custom log forwarder configuration appears.

     Fetching JWT from ESA....
    
     Fetching Certificates from ESA....
    
     % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                     Dload  Upload   Total   Spent    Left  Speed
     100 11264  100 11264    0     0   202k      0 --:--:-- --:--:-- --:--:--  203k
    
     -------------------------------------------------------------------------------
    
    
     Do you want to package any custom LogForwarder configuration files for External Audit Store?
     [ yes ] : Create a PTY_LOGFORWARDER_CONF parcel containing configuration files to be used with External Audit Store.
     [ no ]  : Skip this step.
    
     [ yes or no ]:
    
    
  24. To package the Log Forwarder configuration files for an External Audit Store, type yes.

  25. Press ENTER.
    The prompt to enter the local directory path containing the Log Forwarder configuration files appears.

    Do you want to package any custom LogForwarder configuration files for External Audit Store?
    [ yes ] : Create a PTY_LOGFORWARDER_CONF parcel containing configuration files to be used with External Audit Store.
    [ no ]  : Skip this step.
    
    [ yes or no ]: yes
    
    Creation of PTY_LOGFORWARDER_CONF parcel is enabled.
    
    Enter the local directory path on this machine that stores the LogForwarder configuration files for External Audit Store:
    

    Note: The PTY_LOGFORWARDER_CONF parcel packages any custom Log Forwarder configuration files that you provide; Cloudera Manager can then distribute the parcel across the CDP nodes. Ensure that the custom Log Forwarder configuration files for the External Audit Store have the .conf extension.
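Before entering the directory path in the next step, you can verify that it actually contains files with the required .conf extension. This pre-check is illustrative and not part of the configurator; the directory and file names below are hypothetical.

```shell
# Hypothetical directory and file, created so the check runs standalone.
dir="./logforwarder_conf"
mkdir -p "$dir"
printf '[OUTPUT]\n    Name  s3\n' > "$dir/external_audit_store.conf"

# Count the .conf files the parcel would package.
count=0
for f in "$dir"/*.conf; do
  [ -e "$f" ] && count=$((count + 1))
done
echo "conf files found in $dir: $count"
```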

  26. Enter the local directory path that contains the Log Forwarder configuration files.

  27. Press ENTER.
    The script generates the installation files and uploads them to the specified S3 URI.

    Generating Installation files...
    
    ****************************************************************************************************************************************
    
    Retrieving the S3 bucket's AWS Region via AWS S3 REST API...
    Successfully retrieved S3 bucket's AWS region: <region_name>
    
    Started uploading the Installation files to S3 bucket using REST API.
    
    Uploading BDP_PEP-<BDP_version>.jar...
    -> File uploaded to s3://<bucket_name>/<directory_name>/CSDandParcels/BDP_PEP-<BDP_version>.jar
    
    Uploading PTY_BDP-<BDP_version>_CDP7.3.p<patch_version>-<operating_system_version>.parcel...
    -> File uploaded to s3://<bucket_name>/<directory_name>/CSDandParcels/PTY_BDP-<BDP_version>_CDP7.3.p<patch_version>-<operating_system_version>.parcel
    
    Uploading PTY_BDP-<BDP_Version>_CDP7.3.p<patch_version>-<operating_system_version>.parcel.sha...
    -> File uploaded to s3://<bucket_name>/<directory_name>/CSDandParcels/PTY_BDP-<BDP_version>_CDP7.3.p<patch_version>-<operating_system_version>.parcel.sha
    
    Uploading PTY_CERT-<BDP_Version>_CDP7.3.p<patch_version>-<operating_system_version>.parcel...
    -> File uploaded to s3://<bucket_name>/<directory_name>/CSDandParcels/PTY_CERT-<BDP_version>_CDP7.3.p<patch_version>-<operating_system_version>.parcel
    
    Uploading PTY_CERT-<BDP_Version>_CDP7.3.p<patch_version>-<operating_system_version>.parcel.sha...
    -> File uploaded to s3://<bucket_name>/<directory_name>/CSDandParcels/PTY_CERT-<BDP_version>_CDP7.3.p<patch_version>-<operating_system_version>.parcel.sha
    
    Uploading pepimpala4_0_RHEL.so...
    -> File uploaded to s3://<bucket_name>/<directory_name>/pepimpala/pepimpala4_0_RHEL.so
    
    Uploading createobjects.sql...
    -> File uploaded to s3://<bucket_name>/<directory_name>/pepimpala/sqlscripts/createobjects.sql
    
    Uploading dropobjects.sql...
    -> File uploaded to s3://<bucket_name>/<directory_name>/pepimpala/sqlscripts/dropobjects.sql
    
    Uploading BDP_Pre-Service-Deployment_Recipe_<BDP_version>.sh...
    -> File uploaded to s3://<bucket_name>/<directory_name>/RecipesAndTemplates/BDP_Pre-Service-Deployment_Recipe_<BDP_version>.sh
    
    Uploading BDP_Post-Cloudera-Manager-Start_Recipe_<BDP_version>.sh...
    -> File uploaded to s3://<bucket_name>/<directory_name>/RecipesAndTemplates/BDP_Post-Cloudera-Manager-Start_Recipe_<BDP_version>.sh
    
    Uploading custom_properties_template.json...
    -> File uploaded to s3://<bucket_name>/<directory_name>/RecipesAndTemplates/custom_properties_template.json
    
    Uploading guide_to_create_cluster_template_with_bdp.txt...
    -> File uploaded to s3://<bucket_name>/<directory_name>/RecipesAndTemplates/guide_to_create_cluster_template_with_bdp.txt
    
    Uploading PTY_LOGFORWARDER_CONF-<BDP_Version>_CDP7.3.p<patch_version>-<operating_system_version>.parcel...
    -> File uploaded to s3://<bucket_name>/<directory_name>/CSDandParcels/PTY_LOGFORWARDER_CONF-<BDP_version>_CDP7.3.p<patch_version>-<operating_system_version>.parcel
    
    Uploading PTY_LOGFORWARDER_CONF-<BDP_Version>_CDP7.3.p<patch_version>-<operating_system_version>.parcel.sha...
    -> File uploaded to s3://<bucket_name>/<directory_name>/CSDandParcels/PTY_LOGFORWARDER_CONF-<BDP_version>_CDP7.3.p<patch_version>-<operating_system_version>.parcel.sha
    
    Successfully uploaded Installation files under ./Installation_Files to S3 URI: s3://<bucket_name>/directory_name
    
    ****************************************************************************************************************************************
    
    * The BDP CSD & Parcels (and checksums) are generated locally in ./Installation_Files/CSDandParcels/ directory.
    
    * BDP Recipes, Custom Properties and Custom Cluster Template creation guide are generated locally in ./Installation_Files/RecipesAndTemplates/ directory.
    -> Follow the guide to create a custom Cluster Template and use it along with the 2 Recipes and Custom Properties on CDP AWS.
    
    * The pepimpala .so library is generated locally in ./Installation_Files/pepimpala/ directory.
    
    * The pepimpala SQL scripts to create and drop Impala UDFs is generated locally in ./Installation_Files/pepimpala/sqlscripts/ directory.
    -> Use these scripts as reference to register Protegrity Impala UDFs if you plan to use the Impala Service.
    Note: The location clause in the Create Function query points to the S3 URI of the pepimpala*.so
    
    ****************************************************************************************************************************************
    
    Successfully configured the Big Data Protector Installaton files for CDP AWS DataHub.
    

    If you select the option to generate the installation files locally, the configurator script creates the files in the current working directory.

    Generating Installation files...
    
    ****************************************************************************************************************************************
    
    * The BDP CSD & Parcels (and checksums) are generated locally in ./Installation_Files/CSDandParcels/ directory.
    -> Manually upload them to 's3://<bucket_name>/<directory_name>/CSDandParcels/' [This step is Required]
    
    * BDP Recipes, Custom Properties and Custom Cluster Template creation guide are generated locally in ./Installation_Files/RecipesAndTemplates/ directory.
    -> Follow the guide to create a custom Cluster Template and use it along with the 2 Recipes and Custom Properties on CDP AWS.
    -> Manually upload them to 's3://<bucket_name>/<directory_name>/RecipesAndTemplates/' [This step is Optional]
    
    * You can use the ./Installation_Files/set_unset_bdp_config.sh helper script for setting/unsetting BDP configs in Cloudera Manager.
    
    * The pepimpala .so library is generated locally in ./Installation_Files/pepimpala/ directory.
    -> Manually upload the library to 's3://<bucket_name>/<directory_name>/pepimpala/' [This step is Required]
    
    * The pepimpala SQL scripts to create and drop Impala UDFs is generated locally in ./Installation_Files/pepimpala/sqlscripts/ directory.
    -> Use these scripts as reference to register Protegrity Impala UDFs if you plan to use the Impala Service.
    Note: The location clause in the Create Function query points to the S3 URI of the pepimpala*.so
    -> Manually upload them to 's3://<bucket_name>/<directory_name>/pepimpala/sqlscripts/' [This step is Optional]
    
    ****************************************************************************************************************************************
    
    Successfully configured the Big Data Protector Installaton files for CDP AWS DataHub.
    

Last modified : February 20, 2026