Alpha (A-Z)
The Alpha token type tokenizes both uppercase and lowercase letters.
Table: Alpha Tokenization Type properties
| Tokenization Type Properties | Settings |
|---|---|
| Name | Alpha |
| Token type and Format | Lowercase letters a through z; Uppercase letters A through Z |
| Possibility to set Minimum/maximum length | No |
| Left/Right settings | Yes |
| Internal IV | Yes, if Left/Right settings are non-zero |
| External IV | Yes |
| Token specific properties | None |

Length settings for the SLT_1_3 and SLT_2_3 tokenizers:

| Length Preservation | Allow Short Data | Minimum length | Maximum length |
|---|---|---|---|
| Yes | Yes | 1 | 4096 |
| Yes | No, return input as it is | 3 | 4096 |
| Yes | No, generate error | 3 | 4096 |
| No | NA | 1 | 4076 |
The following table shows examples of how values are tokenized with the Alpha token.
Table: Examples of Alpha tokenization values
| Input Value | Tokenized Value | Comments |
|---|---|---|
| abc | nvr | Alpha, SLT_1_3, Left=0, Right=0, Length Preservation=Yes. The value has the minimum length for the SLT_1_3 tokenizer. |
| MA | TGi | Alpha, SLT_2_3, Left=0, Right=0, Length Preservation=No. The value is padded up to 3 characters, which is the minimum length for the SLT_2_3 tokenizer. |
| MA | Error. Input too short. | Alpha, SLT_1_3, Left=0, Right=0, Length Preservation=Yes, Allow Short Data=No, generate error. The input value has only two alpha characters to tokenize, which is too short for the SLT_1_3 tokenizer when Length Preservation=Yes and Allow Short Data=No, generate error. |
| MA MAC | MA TGH | Alpha, SLT_1_3, Left=0, Right=0, Length Preservation=Yes, Allow Short Data=No, return input as it is. If the input value has fewer than three characters to tokenize, it is returned as is; otherwise it is tokenized. |
| MA | TG | Alpha, SLT_1_3, Left=0, Right=0, Length Preservation=Yes, Allow Short Data=Yes. The input value has only two alpha characters, which meets the minimum length requirement for the SLT_1_3 tokenizer when Length Preservation=Yes and Allow Short Data=Yes. |
| 131 Summer Street, Bridgewater | 131 VDYgAK q vMDUn, zAEXmwqWYNQG | Alpha, SLT_2_3, Left=0, Right=0, Length Preservation=No. Numeric characters, spaces, and commas are treated as delimiters and are not tokenized. The output value is longer than the input value. |
| Albert Einstein | SldGzm OOCTzSFo | Alpha, SLT_1_3, Left=0, Right=0, Length Preservation=Yes. The space is treated as a delimiter and is not tokenized. The output value is the same length as the input value. |
| Albert Einstein | AjAkqD vvBFYLdo | Alpha, SLT_1_3, Left=1, Right=0, Length Preservation=Yes. One character from the left remains in the clear. |
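The delimiter and Left/Right semantics shown in the examples above can be sketched with a toy substitution. This is not Protegrity's SLT algorithm (the real token tables are keyed and secret); `toy_tokenize` and its fixed letter rotation are illustrative assumptions that only demonstrate which characters are transformed and which stay in the clear:

```python
import string

def _sub(ch):
    # Stand-in for the real SLT lookup: a fixed rotation within a-z or A-Z.
    if ch in string.ascii_lowercase:
        return chr((ord(ch) - ord('a') + 13) % 26 + ord('a'))
    if ch in string.ascii_uppercase:
        return chr((ord(ch) - ord('A') + 13) % 26 + ord('A'))
    return ch

def toy_tokenize(value, left=0, right=0):
    """Transform only A-Z/a-z; all other characters act as delimiters
    and pass through unchanged. The first `left` and last `right`
    alpha characters remain in the clear."""
    alpha_positions = [i for i, c in enumerate(value) if c.isalpha()]
    keep = set(alpha_positions[:left])
    if right:
        keep |= set(alpha_positions[-right:])
    out = []
    for i, c in enumerate(value):
        if c.isalpha() and i not in keep:
            out.append(_sub(c))        # tokenized character
        else:
            out.append(c)              # delimiter or in-the-clear character
    return ''.join(out)
```

As in the table, digits, spaces, and commas pass through untouched, the output of a length-preserving substitution matches the input length, and `left=1` leaves the first alpha character in the clear.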
Alpha Tokenization Properties for different protectors
Application Protector
The following table shows supported input data types for Application protectors with the Alpha token.
Note: For both SLT_1_3 and SLT_2_3, the maximum length of the protected data is 4096 bytes. This applies to the Alpha token element used with the Application Protector without length preservation.
Table: Supported input data types for Application protectors with Alpha token
| Application Protectors*2 | AP Java*1 | AP Python |
|---|---|---|
| Supported input data types | BYTE[] CHAR[] STRING | BYTES STRING |
*1 - The API accepts and returns data in BYTE[] format. The customer application needs to convert the input into byte arrays before calling the API, and similarly, convert the output from byte arrays after receiving the response from the API.
*2 - The Protegrity Application Protector only supports bytes converted from the string data type. If any other data type is directly converted to bytes and passed as input to the Application Protector APIs that support byte as input and provide byte as output, then data corruption might occur.
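Per notes *1 and *2, the byte-oriented APIs expect the caller to convert strings to bytes and back. The pattern can be sketched as follows; `protect_bytes` is a hypothetical stand-in (the real Application Protector entry points differ), and only the UTF-8 encode/decode handling around the call is the point:

```python
def protect_bytes(data: bytes) -> bytes:
    # Hypothetical placeholder for the real byte-in/byte-out API call;
    # here it simply echoes the payload.
    return data

plaintext = "Albert Einstein"

# Convert the STRING to bytes before calling the API...
token_bytes = protect_bytes(plaintext.encode("utf-8"))

# ...and convert the returned bytes back to a string afterwards.
token = token_bytes.decode("utf-8")

# Per note *2, bytes packed from non-string types are not supported
# and may corrupt data:
# protect_bytes((42).to_bytes(4, "big"))  # don't do this
```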
For more information about Application protectors, refer to Application Protector.
Big Data Protector
Protegrity supports MapReduce, Hive, Pig, HBase, Spark, and Impala, which utilize the Hadoop Distributed File System (HDFS) or Ozone as the data storage layer. The data is protected from internal and external threats, while users and business processes continue to utilize the secured data. Protegrity protects data inside the files using tokenization and strong encryption protection methods.
The following table shows supported input data types for Big Data protectors with the Alpha token.
Table: Supported input data types for Big Data protectors with Alpha token
| Big Data Protectors | MapReduce*2 | Hive | Pig | HBase*2 | Impala | Spark*2 | Spark SQL | Trino |
|---|---|---|---|---|---|---|---|---|
| Supported input data types*1 | BYTE[] | CHAR*3 STRING | CHARARRAY | BYTE[] | STRING | BYTE[] STRING | STRING | VARCHAR |
*1 – If the input and output types of the API are BYTE[], then the customer application should convert the input to a byte array before calling the API and convert the output from a byte array after receiving the response.
*2 – The Protegrity MapReduce protector, HBase coprocessor, and Spark protector only support bytes converted from the string data type. Data corruption might occur when:
- Any other data type is directly converted to bytes and passed as input to a MapReduce or Spark API that accepts byte input and provides byte output.
- Any other data type is directly converted to bytes and inserted into an HBase table that is configured with the Protegrity HBase coprocessor.
*3 – If you are using the Char tokenization UDFs in Hive, ensure that the data elements have length preservation selected. Using data elements without length preservation selected is not supported with the Char tokenization UDFs.
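The corruption risk described in note *2 can be demonstrated with plain Python, using nothing Protegrity-specific: bytes derived from a string round-trip safely through text handling, while bytes packed from another type (an integer, in this illustrative case) do not:

```python
# Bytes converted from a string round-trip safely:
s = "MA"
assert s.encode("utf-8").decode("utf-8") == s

# Bytes packed from other types are not character data. Treating them
# as text (as a protector expecting string bytes would) is lossy:
n = 1000
raw = n.to_bytes(2, "big")                    # b'\x03\xe8'
text = raw.decode("utf-8", errors="replace")  # \xe8 is invalid UTF-8 here
corrupted = text.encode("utf-8")

# The round trip fails, i.e. the original value is unrecoverable:
assert corrupted != raw
```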
For more information about Big Data protectors, refer to Big Data Protector.
Data Warehouse Protector
The Protegrity Data Warehouse Protector is an advanced security solution designed to protect sensitive data at the column level. This enables you to secure your data while still permitting access for authorized users. Additionally, the Data Warehouse Protector integrates seamlessly with existing database systems using User-Defined Functions for enhanced security. Protegrity protects data inside the data warehouses using various tokenization and encryption methods.
The following table shows the supported input data types for the Teradata protector with the Alpha token.
Table: Supported input data types for Data Warehouse protectors with Alpha token
| Data Warehouse Protectors | Teradata |
|---|---|
| Supported input data types | VARCHAR LATIN |
For more information about Data Warehouse protectors, refer to Data Warehouse Protector.
Database Protectors
The following table shows supported input data types for Database protectors with the Alpha token.
Table: Supported input data types for Database protectors with Alpha token
| Protector | Oracle | MSSQL |
|---|---|---|
| Supported Input Data Types | VARCHAR2 CHAR | VARCHAR CHAR |
For more information about Database protectors, refer to Database Protectors.