Tokenization
A relatively new approach to removing sensitive data from business processes, applications, and user interactions is tokenization. This method is commonly presented as a way to reduce PCI DSS scope and the business risk associated with storing credit card numbers. Tokenization is the process of generating a representation of data, called a token, and substituting the token wherever the original data would be used. A database maps the original data to the token value, so that either value can be retrieved when needed and the token remains backed by a real value.
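As a rough sketch of this mapping, the Python below implements a minimal token vault with an in-memory dictionary. The names (TokenVault, tokenize, detokenize) are illustrative assumptions, not from any particular product; a production system would use a hardened database and tokens generated per its tokenization standard.

    import secrets

    class TokenVault:
        """Hypothetical vault mapping original values to opaque tokens and back."""

        def __init__(self):
            self._token_to_value = {}   # token -> original data
            self._value_to_token = {}   # original data -> token

        def tokenize(self, value: str) -> str:
            # Reuse an existing token so the same value always maps to one token.
            if value in self._value_to_token:
                return self._value_to_token[value]
            token = secrets.token_hex(16)  # random value; carries no sensitive data
            self._token_to_value[token] = value
            self._value_to_token[value] = token
            return token

        def detokenize(self, token: str) -> str:
            # Retrieving the real value should be rare and tightly access-controlled.
            return self._token_to_value[token]

Because the token is random rather than derived from the original data, possessing the token alone reveals nothing; the mapping database is the only path back to the real value, which is why it must be the most protected component of the design.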
Consider credit card numbers entered at the point of sale and then sent on for authorization. Once authorization occurs, there are only a few reasons the credit card number would need to be retained beyond the transaction. Since those reasons do not actually require the card number itself, a unique value such as a token can be used instead to support business intelligence, fraud investigations...
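A sketch of that flow, under the same assumptions as the TokenVault above (the processor response and record fields are invented for illustration): after authorization succeeds, the card number is swapped for a token, and only the token travels to downstream systems.

    vault = TokenVault()

    pan = "4111111111111111"      # card number captured at the point of sale
    auth_result = "approved"      # assume the payment processor approved the sale

    if auth_result == "approved":
        token = vault.tokenize(pan)
        del pan                   # discard the real number once authorization is done
        # Downstream systems (reporting, fraud analysis) see only the token.
        sale_record = {"token": token, "amount": 42.50}
        print(sale_record)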