Procedure
used by financial systems to dissociate identifiers from the transaction they
reference. This way, a theft of information yields little of value to the
attacker, but the legitimate owner can still establish the proper relationship
with the corresponding transaction.
Tokenization is the process
of substituting a sensitive data element with an "easily" reversible
benign substitute. Easily means with regards to the data owner - the algorithm
used shouldn't be easy to guess and is the key security strength indicator of
tokenization. Tokenization can be used to safeguard sensitive data involving,
for example, bank accounts, financial statements, medical records, criminal
records, driver's licenses, loan applications, stock trades, voter registrations,
and other types of personally identifiable information (PII).
http://en.wikipedia.org/wiki/Tokenization_%28data_security%29
Tokenization is a process by
which the primary account number (PAN) is replaced with a surrogate value
called a “token”. De-tokenization is the reverse process of redeeming a token
for its associated PAN value. The security of an individual token relies
predominantly on the infeasibility of determining the original PAN knowing only
the surrogate value.
PCI Data Security Standard
(PCI DSS) -- Information Supplement: PCI DSS Tokenization Guidelines
Tokenization is the process
of replacing sensitive data with unique identification symbols that retain all
the essential information about the data without compromising its security.
Tokenization seeks to minimize the amount of data a business needs to keep on
hand.
http://searchsecurity.techtarget.com/