Working with the Configurator Script

The configurator script performs the following tasks:

  1. Gets the IP address of the Application Protector REST server.
  2. Creates the Databricks Unity Catalog Batch Python UDFs.
  3. Deletes the Databricks Unity Catalog Batch Python UDFs.

The configurator script provides the --help option, which describes the options and the arguments that must be provided.

To view the options and the arguments for the configurator script:

  1. Log in to the node where the installation files are extracted.
  2. Run the following command:
    ./BigDataProtector-Configurator_Linux-ALL-64_x86-64_AWS.Databricks-<DBR_version>-64_<BDP_version>.sh --help
    
  3. Press ENTER. The command displays all the options and the arguments required to execute the configurator script.
    This script requires the following inputs as strings:
     1. The ID of the operation.
        ----------------------------------------------------------
        | ID | Operation                                         |
        ----------------------------------------------------------
        |  1 | Get Application Protector REST's Server IP        |
        |  2 | Create Databricks Unity Catalog Batch Python UDFs |
        |  3 | Delete Databricks Unity Catalog Batch Python UDFs |
        ----------------------------------------------------------
     2. The URL of the Databricks Workspace.
     3. The Application ID of the Databricks Service Principal
     4. The OAuth Secret of the Databricks Service Principal
     5. The ID of the Databricks Compute.
    
    If the ID of the operation is specified as "2" or "3", then the script requires the following additional inputs as strings:
     6. The name of the Databricks Unity Catalog Catalog-Schema.
     7. The ID of the approach.
        -----------------------------------
        | ID | Approach                   |
        -----------------------------------
        |  1 | Application Protector REST |
        |  2 | Cloud Protector            |
        -----------------------------------
    
    If the ID of the operation is specified as "2" and the ID of the approach is specified as "1", then the script requires the following additional inputs as strings:
     8. The path of the CA Certificate.
     9. The path of the Server Certificate.
    10. The path of the Server Key.
    11. The name of the AWS Secret.
    12. The name of the AWS Secret's AWS Region.
    13. The name of the Databricks Unity Catalog Service Credential.
    14. The path of the Databricks Unity Catalog Volume.
    
    If the ID of the operation is specified as "2" and the ID of the approach is specified as "2", then the script requires the following additional inputs as strings:
     8. The name of the AWS Lambda Function.
     9. The name of the AWS Lambda Function's AWS Region.
    10. The name of the Databricks Unity Catalog Service Credential.
    
    If the ID of the operation is specified as "3" and the ID of the approach is specified as "1", then the script requires the following additional input as a string:
     8. The path of the Databricks Unity Catalog Volume.
    
    
    This script accepts the above-mentioned inputs in any one of the following ways:
     1. Using a .cfg file (pass the path of the .cfg file to this script as a command-line argument).
     2. Using command-line arguments.
     3. Using interactive prompts.
    
    
    Structure of the .cfg file:
    operation_id = "operation_id"
    databricks_workspace_url = "databricks_workspace_url"
    databricks_service_principal_application_id = "databricks_service_principal_application_id"
    databricks_service_principal_oauth_secret = "databricks_service_principal_oauth_secret"
    databricks_compute_id = "databricks_compute_id"
    databricks_unity_catalog_catalog_schema_name = "databricks_unity_catalog_catalog_schema_name"
    approach_id = "approach_id"
    ca_certificate_path = "ca_certificate_path"
    server_certificate_path = "server_certificate_path"
    server_key_path = "server_key_path"
    aws_secret_name = "aws_secret_name"
    aws_secret_aws_region_name = "aws_secret_aws_region_name"
    databricks_unity_catalog_service_credential_name = "databricks_unity_catalog_service_credential_name"
    databricks_unity_catalog_volume_path = "databricks_unity_catalog_volume_path"
    aws_lambda_function_name = "aws_lambda_function_name"
    aws_lambda_function_aws_region_name = "aws_lambda_function_aws_region_name"
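
    As an illustration, a .cfg file for operation "2" (create UDFs) with the Cloud Protector approach ("2") might look as follows. All values in angle brackets are hypothetical placeholders, not defaults; replace them with the values from your environment, and omit the keys that do not apply to the selected operation and approach:

    operation_id = "2"
    databricks_workspace_url = "<databricks_workspace_url>"
    databricks_service_principal_application_id = "<application_id>"
    databricks_service_principal_oauth_secret = "<oauth_secret>"
    databricks_compute_id = "<compute_id>"
    databricks_unity_catalog_catalog_schema_name = "<catalog_schema_name>"
    approach_id = "2"
    aws_lambda_function_name = "<lambda_function_name>"
    aws_lambda_function_aws_region_name = "<aws_region_name>"
    databricks_unity_catalog_service_credential_name = "<service_credential_name>"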
    
    
    Syntax of the command-line arguments:
    --operation_id "operation_id"
    --databricks_workspace_url "databricks_workspace_url"
    --databricks_service_principal_application_id "databricks_service_principal_application_id"
    --databricks_service_principal_oauth_secret "databricks_service_principal_oauth_secret"
    --databricks_compute_id "databricks_compute_id"
    --databricks_unity_catalog_catalog_schema_name "databricks_unity_catalog_catalog_schema_name"
    --approach_id "approach_id"
    --ca_certificate_path "ca_certificate_path"
    --server_certificate_path "server_certificate_path"
    --server_key_path "server_key_path"
    --aws_secret_name "aws_secret_name"
    --aws_secret_aws_region_name "aws_secret_aws_region_name"
    --databricks_unity_catalog_service_credential_name "databricks_unity_catalog_service_credential_name"
    --databricks_unity_catalog_volume_path "databricks_unity_catalog_volume_path"
    --aws_lambda_function_name "aws_lambda_function_name"
    --aws_lambda_function_aws_region_name "aws_lambda_function_aws_region_name"
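
    For example, the same inputs can be supplied directly as command-line arguments. The values in angle brackets are hypothetical placeholders; the script file name must match the installation files extracted on your node:

    ./BigDataProtector-Configurator_Linux-ALL-64_x86-64_AWS.Databricks-<DBR_version>-64_<BDP_version>.sh \
      --operation_id "2" \
      --databricks_workspace_url "<databricks_workspace_url>" \
      --databricks_service_principal_application_id "<application_id>" \
      --databricks_service_principal_oauth_secret "<oauth_secret>" \
      --databricks_compute_id "<compute_id>" \
      --databricks_unity_catalog_catalog_schema_name "<catalog_schema_name>" \
      --approach_id "2" \
      --aws_lambda_function_name "<lambda_function_name>" \
      --aws_lambda_function_aws_region_name "<aws_region_name>" \
      --databricks_unity_catalog_service_credential_name "<service_credential_name>"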
    

Last modified: February 12, 2026