Best practices when using Protegrity Anonymization
Ensure that the source file is clean based on the following checks:
- Each column contains the correct type of data. For example, a numeric field, such as salary, must not contain text values.
- The text in the file matches the selected encoding. Special characters, or characters that cannot be processed, must not be present in the source file.
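The checks above can be automated before submitting a file. The following is a minimal sketch, not part of the product: the helper name `check_source_file`, the ASCII-only rule, and the CSV format are illustrative assumptions.

```python
import csv
import io

def check_source_file(text, numeric_columns):
    """Validate a CSV source before anonymization (hypothetical helper).

    Flags declared numeric columns that contain non-numeric values, and
    fields with characters outside the expected encoding (ASCII here,
    as an illustrative stand-in for the encoding you selected).
    """
    problems = []
    reader = csv.DictReader(io.StringIO(text))
    for row_num, row in enumerate(reader, start=2):  # header is line 1
        for col, value in row.items():
            if col in numeric_columns:
                try:
                    float(value)
                except ValueError:
                    problems.append((row_num, col, "non-numeric value"))
            if not value.isascii():
                problems.append((row_num, col, "unsupported character"))
    return problems

sample = "name,salary\nAlice,50000\nBob,high\n"
print(check_source_file(sample, numeric_columns={"salary"}))
# row 3: 'salary' contains text instead of a number
```

Running such a check locally is cheaper than discovering a dirty column after an anonymization job has already been queued.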
Move the anonymized data file and the logs generated to a different system before deleting your environment.
The maximum dataframe size that can be attached to an anonymization job is 100 MB.
To process a larger dataset, use one of the available cloud storage options.
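One way to decide between attaching a dataset directly and staging it in cloud storage is to estimate its serialized size first. This is an illustrative sketch: the helper name and the CSV serialization are assumptions; only the 100 MB cap comes from the documentation.

```python
import csv
import io

MAX_JOB_SIZE_BYTES = 100 * 1024 * 1024  # 100 MB limit from the documentation

def fits_in_one_job(rows, header):
    """Estimate a dataset's serialized size (hypothetical helper).

    Serializes the rows to CSV in memory and compares the byte count
    against the 100 MB cap; anything larger should be staged in
    cloud storage instead of attached to the job.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    writer.writerows(rows)
    return len(buf.getvalue().encode("utf-8")) <= MAX_JOB_SIZE_BYTES

header = ["id", "salary"]
rows = [(i, 50000 + i) for i in range(1000)]
print(fits_in_one_job(rows, header))  # a small sample easily fits: True
```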
Run a maximum of 5 anonymization jobs in Protegrity Anonymization: A maximum of 5 jobs can be placed on the Protegrity Anonymization queue for adequate utilization of resources. Any jobs submitted beyond the initial 5 are rejected and not processed. If required, increase the limit by raising the JOB_QUEUE_SIZE parameter in the config.yaml file. For Docker, update the config-docker.yaml file.
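For example, raising the queue limit might look like the fragment below. Only the parameter name JOB_QUEUE_SIZE and the file names come from the documentation; the surrounding structure and the chosen value are illustrative.

```yaml
# config.yaml (use config-docker.yaml for Docker deployments)
# Structure shown here is an assumption; only JOB_QUEUE_SIZE is documented.
JOB_QUEUE_SIZE: 10   # raise the default queue limit of 5 to 10
```

After changing the value, restart the service so the new limit takes effect.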
Protegrity Anonymization accepts a maximum of 60 requests per minute: If more than 60 requests are raised in a minute, the excess requests are rejected and not processed. If required, increase the limit by raising the DEFAULT_API_RATE_LIMIT parameter in the config.yaml file. For Docker, update the config-docker.yaml file.
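Rather than letting excess requests be rejected, a client can throttle itself to stay under the 60-requests-per-minute cap. This sliding-window limiter is a generic client-side sketch, not part of the Protegrity API; the class name and defaults are assumptions.

```python
import time
from collections import deque

class RateLimiter:
    """Client-side sliding-window throttle (illustrative).

    Keeps at most max_requests timestamps inside the window and sleeps
    until the oldest one expires before allowing the next request.
    """

    def __init__(self, max_requests=60, window_seconds=60.0):
        self.max_requests = max_requests
        self.window = window_seconds
        self.sent = deque()  # timestamps of recent requests

    def acquire(self):
        now = time.monotonic()
        # drop timestamps that have aged out of the window
        while self.sent and now - self.sent[0] >= self.window:
            self.sent.popleft()
        if len(self.sent) >= self.max_requests:
            # wait until the oldest request leaves the window, then retry
            time.sleep(self.window - (now - self.sent[0]))
            return self.acquire()
        self.sent.append(time.monotonic())

limiter = RateLimiter()
for _ in range(3):
    limiter.acquire()  # wrap each API call with acquire()
print(len(limiter.sent))  # 3 requests recorded within the window
```

Wrapping each request in `acquire()` smooths bursts out over the minute instead of losing the requests beyond the sixtieth.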