For the Application Protector REST Approach
Ensure that the following prerequisites are met before installing the Big Data Protector:
Python 3, with the requests module installed, is available on the machine that runs the configurator script.
A compatible version of ESA is installed, configured, and running.
Access to the Databricks workspace is available.
A Databricks cluster of one of the following types is created and is in the running state:
- Dedicated Compute
- Standard Compute
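The cluster prerequisite can be verified programmatically through the Databricks Clusters API. The following is a minimal sketch using the requests module mentioned above; the workspace URL, access token, and cluster ID are placeholders that must be replaced with your own values.

```python
import os
import requests


def cluster_state_url(workspace_url: str, cluster_id: str) -> str:
    """Build the Clusters API 2.0 'get' endpoint URL for a given cluster."""
    return f"{workspace_url}/api/2.0/clusters/get?cluster_id={cluster_id}"


def is_cluster_running(workspace_url: str, token: str, cluster_id: str) -> bool:
    """Return True when the cluster reports the RUNNING state."""
    resp = requests.get(
        cluster_state_url(workspace_url, cluster_id),
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("state") == "RUNNING"


if __name__ == "__main__":
    # Placeholder values -- substitute your own workspace details.
    workspace = os.environ.get("DATABRICKS_HOST", "https://example.cloud.databricks.com")
    token = os.environ.get("DATABRICKS_TOKEN", "")
    print(is_cluster_running(workspace, token, "0101-000000-abcdef12"))
```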
A Databricks Service Principal is created.
The Databricks Service Principal must have the Can attach to permission on the cluster.
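The Can attach to permission can be granted through the Databricks Permissions API. A sketch under the assumption that the service principal is identified by its application ID; the workspace URL, token, and cluster ID are placeholders.

```python
import requests


def attach_permission_payload(sp_application_id: str) -> dict:
    """ACL entry granting the service principal 'Can attach to' on a cluster."""
    return {
        "access_control_list": [
            {
                "service_principal_name": sp_application_id,
                "permission_level": "CAN_ATTACH_TO",
            }
        ]
    }


def grant_can_attach_to(workspace_url: str, token: str,
                        cluster_id: str, sp_application_id: str) -> dict:
    """PATCH the cluster's permissions to add the service principal's entry.

    PATCH adds to the existing ACL; a PUT would replace it entirely.
    """
    resp = requests.patch(
        f"{workspace_url}/api/2.0/permissions/clusters/{cluster_id}",
        headers={"Authorization": f"Bearer {token}"},
        json=attach_permission_payload(sp_application_id),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()
```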
Create the following certificates for mutual TLS authorization:
- CA Certificate
- Server Certificate
- Non-encrypted Server Key
- Client Certificate
- Non-encrypted Client Key
Note: These certificates must be generated only after retrieving the IP address of the Application Protector REST server.
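One common way to produce the certificate material listed above is OpenSSL. The sketch below builds the openssl command lines in Python so that the REST server IP, retrieved beforehand per the note, can be embedded in the server certificate's subjectAltName; all file names and subject fields are illustrative placeholders, not values mandated by the product.

```python
import pathlib
import subprocess


def san_extension(server_ip: str) -> str:
    """subjectAltName extension pinning the REST server's IP address.

    This dependency on the IP is why the certificates can be generated
    only after the address is known.
    """
    return f"subjectAltName = IP:{server_ip}\n"


def mtls_openssl_commands(days: int = 365) -> list:
    """openssl command lines producing a CA, server, and client certificate.

    '-nodes' keeps every private key non-encrypted, matching the list above.
    """
    return [
        # Self-signed CA certificate and its key.
        ["openssl", "req", "-x509", "-newkey", "rsa:2048", "-nodes",
         "-keyout", "ca.key", "-out", "ca.crt", "-days", str(days),
         "-subj", "/CN=ap-rest-ca"],
        # Server key and certificate signing request.
        ["openssl", "req", "-newkey", "rsa:2048", "-nodes",
         "-keyout", "server.key", "-out", "server.csr",
         "-subj", "/CN=ap-rest-server"],
        # Sign the server CSR with the CA, embedding the IP-based SAN.
        ["openssl", "x509", "-req", "-in", "server.csr",
         "-CA", "ca.crt", "-CAkey", "ca.key", "-CAcreateserial",
         "-out", "server.crt", "-days", str(days),
         "-extfile", "server_ext.cnf"],
        # Client key and certificate signing request.
        ["openssl", "req", "-newkey", "rsa:2048", "-nodes",
         "-keyout", "client.key", "-out", "client.csr",
         "-subj", "/CN=ap-rest-client"],
        # Sign the client CSR with the same CA.
        ["openssl", "x509", "-req", "-in", "client.csr",
         "-CA", "ca.crt", "-CAkey", "ca.key", "-CAcreateserial",
         "-out", "client.crt", "-days", str(days)],
    ]


def generate_mtls_certificates(server_ip: str) -> None:
    """Write the SAN extension file, then run each openssl command in order."""
    pathlib.Path("server_ext.cnf").write_text(san_extension(server_ip))
    for cmd in mtls_openssl_commands():
        subprocess.run(cmd, check=True)
```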
Permission to create and store secrets in AWS Secrets Manager is available.
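This permission can be confirmed by creating a secret. A sketch assuming the AWS SDK for Python (boto3) and the secretsmanager:CreateSecret permission; the secret name, value, and region are placeholders.

```python
import json


def secret_payload(name: str, value: dict) -> dict:
    """Keyword arguments for SecretsManager create_secret; the value is
    serialized as a JSON string."""
    return {"Name": name, "SecretString": json.dumps(value)}


def store_secret(region: str, name: str, value: dict) -> dict:
    """Create the secret in AWS Secrets Manager.

    Requires the secretsmanager:CreateSecret permission on the caller.
    """
    import boto3  # third-party AWS SDK; imported lazily so the helper
                  # above stays usable without it.
    client = boto3.client("secretsmanager", region_name=region)
    return client.create_secret(**secret_payload(name, value))
```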
An AWS Databricks Unity Catalog Service Credential is created.
Note: For more information about creating the credential, refer to https://docs.databricks.com/aws/en/connect/unity-catalog/cloud-services/service-credentials.
The Databricks Service Principal must have access permissions on the Databricks Unity Catalog Service Credential.
A Databricks Unity Catalog Volume is available with a Catalog and a Schema and the following permissions:
- The Databricks Service Principal must have the Read volume and Write volume permissions on the Databricks Unity Catalog Volume.
- The Databricks Service Principal must have the Use catalog permission at the Catalog level.
- The Databricks Service Principal must have the Use schema permission at the Schema level.
- The Databricks Service Principal must have the Create function permission at the Schema level.
- The Databricks Service Principal must have the Manage permission at the Schema level.
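The volume, catalog, and schema permissions above can be applied programmatically through the Unity Catalog Grants API. A sketch using the requests module; the catalog, schema, and volume names, the workspace URL, the token, and the service principal's application ID are all placeholders.

```python
import requests

# Securable -> privileges required by the prerequisites above.
# The names 'main', 'main.protector', and 'main.protector.ap_rest'
# are illustrative placeholders for your own catalog, schema, and volume.
REQUIRED_GRANTS = {
    ("catalog", "main"): ["USE_CATALOG"],
    ("schema", "main.protector"): ["USE_SCHEMA", "CREATE_FUNCTION", "MANAGE"],
    ("volume", "main.protector.ap_rest"): ["READ_VOLUME", "WRITE_VOLUME"],
}


def grant_change(principal: str, privileges: list) -> dict:
    """Request body for the Unity Catalog Grants API permissions update."""
    return {"changes": [{"principal": principal, "add": privileges}]}


def apply_grants(workspace_url: str, token: str, principal: str) -> None:
    """PATCH each securable's permissions to add the required privileges."""
    for (securable, full_name), privileges in REQUIRED_GRANTS.items():
        resp = requests.patch(
            f"{workspace_url}/api/2.1/unity-catalog/permissions/{securable}/{full_name}",
            headers={"Authorization": f"Bearer {token}"},
            json=grant_change(principal, privileges),
            timeout=30,
        )
        resp.raise_for_status()
```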