Notes on security and computation

All the algorithms we have seen in this chapter are symmetric. The basic problem that remains unsolved is the transmission of the key. As I've already said, this problem will be overcome by asymmetric cryptography, which we will explore in the next chapter. In this section, we will analyze the computational problems that underlie the security of cryptographic algorithms in general. Later in the book, we will look at the security of each algorithm we analyze.

To draw an analogy, we can say that in cryptography the weakest link breaks the entire chain. It is the same problem as using a very strong cryptographic algorithm to protect your data but leaving the password in plain sight on the computer screen. In other words, every component of a cryptographic algorithm, including the mathematical problems it relies on, has to offer a comparable grade of security. To clarify this concept with an example: the factorization and discrete logarithm problems have similar computational characteristics for now; however, if tomorrow one of these problems were solved, an algorithm that relies on both would become useless.

Let's go deeper and analyze some of the principles that are universally recognized in cryptography. The first statement is this: cryptography has to be open source.

With the term open source, I am referring to the algorithm and not, obviously, to the key. In other words, we have to rely on Kerckhoffs' principle, which states the following:

"A cryptosystem should be secure even if everything about the system, except the key, is public knowledge.

Kerckhoffs' principle applies beyond codes and ciphers to security systems in general: every secret creates a potential failure point. Secrecy, in other words, is a prime cause of brittleness—and therefore something likely to make a system prone to catastrophic collapse. Conversely, openness provides ductility."

– Bruce Schneier

In practice, the algorithm that underlies the encryption has to be known. It is not only useless but also dangerous to rely on the secrecy of the algorithm in order to exchange secret messages. The reason is that if an algorithm has to be used by an open community (just like the internet), it is essentially impossible to keep it secret.

The second statement is this: the security of an algorithm depends largely on its underlying mathematical problem.

As an example, RSA, one of the most famous and most widely used algorithms in the history of cryptography, relies on the mathematical problem of factorization.

Factorization is essentially the decomposition of a number into its prime factors:

21 = 3 x 7

For a small integer such as 21, it's very easy to find the factors, 3 and 7, but it is also well known that as the number of digits increases, the cost of factorization grows dramatically.
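
To make this concrete, here is a minimal sketch of factorization by trial division in Python (the code and names are mine, not the book's). The loop only has to test divisors up to the square root of n, yet for a d-digit number that is still roughly 10^(d/2) candidates, a cost that grows exponentially with the number of digits:

def trial_division(n):
    # Naive factorization: test every candidate divisor up to sqrt(n).
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # whatever remains is a prime factor
    return factors

print(trial_division(21))  # prints [3, 7]

For a 10-digit semi-prime this finishes almost instantly; for the 250-digit numbers used with RSA, the same loop would never terminate in any realistic amount of time.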

We will analyze asymmetric algorithms such as RSA in depth later in this book, in particular in Chapter 3, Asymmetric Encryption. Here, it is sufficient to explain why RSA is used to protect financial, intelligence, and other kinds of very sensitive secrets.

The reason is that the mathematical problem underlying RSA (factorization) is still a hard problem to solve for computers of the current generation. In this introductory section, I can't go deeper into analyzing RSA, so I will limit myself to saying that factorization is not RSA's only point of attack; there is another problem of comparable computational difficulty, the discrete logarithm problem. Later in the book, we will analyze both of these hard computational problems. For now, we assume (incorrectly, as most cryptographic texts do) that the pillar of security underlying RSA is factorization. In Chapter 6, New Algorithms in Public/Private Key Cryptography, I will show an attack on the RSA algorithm that depends on a problem other than factorization. This is the weak-link analogy from the beginning of this section: if something in an algorithm goes wrong, the security of the whole algorithm fails.

Anyway, let's see what happens when we attempt to break RSA by attacking the factorization problem with brute force. Factorizing a big semi-prime number is not easy at all once we are dealing with hundreds or thousands of digits. To give you an idea of the computational power required, consider RSA-250, an 829-bit number with 250 decimal digits, which is still very hard to factor with a computer of the current generation.
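
To get a feel for the numbers (this back-of-the-envelope estimate is mine, not the book's): brute-force trial division would have to test candidate divisors up to the square root of the number, that is, around sqrt(10^250) = 10^125 candidates. Even at a trillion (10^12) divisions per second, that would take on the order of 10^113 seconds, while the age of the universe is only about 4 x 10^17 seconds. This is why real factorization records rely on far more sophisticated algorithms than brute force.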

This integer was factorized in February 2020, with a total computation time of about 2,700 core-years on Intel Xeon Gold 6130 CPUs at 2.1 GHz. Like many factorization records, this one was performed on a grid of around 100 machines running a highly optimized implementation of the number field sieve rather than brute force.

The third statement is this: practical security is always weaker than theoretical security.

For example, if we analyze the Vernam cipher, we can easily understand how difficult it is to implement this algorithm in practice: the key must be truly random, at least as long as the message, and never reused. So, we can say that Vernam is invulnerable in theory but not in practice. A corollary of this statement is the following: implementing an algorithm means putting its theoretical scheme into practice, which adds much more complexity to it. So, complexity is the enemy of security: the more complex a system is, the more points of attack can be found.
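
As an illustrative sketch only (this is my own code, not the book's), the encryption step of a Vernam-style one-time pad is nothing more than XORing the message with the key. All the practical difficulty lies outside this function: generating a truly random key, distributing it securely, and never reusing it:

import secrets

def otp_encrypt(message, key):
    # Vernam / one-time pad: XOR each message byte with a key byte.
    assert len(key) == len(message), "the key must be as long as the message"
    return bytes(m ^ k for m, k in zip(message, key))

otp_decrypt = otp_encrypt  # XOR is its own inverse

message = b"ATTACK AT DAWN"
key = secrets.token_bytes(len(message))  # fresh random key, used only once
ciphertext = otp_encrypt(message, key)
assert otp_decrypt(ciphertext, key) == message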

Another consideration is related to the grade of security of an algorithm. We can better understand this concept by considering Shannon's theory and the concept of perfect secrecy. The definition of perfect secrecy given by Claude Shannon in 1949 is based on statistics and probabilities. Regarding the maximum grade of security, Shannon theorized that a cipher maintains perfect secrecy if an attacker's knowledge of the content of a message is the same both before and after inspecting the ciphertext, even when attacking it with unlimited resources. That is, the ciphertext gives the adversary precisely no information about the message contents.
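
In probabilistic terms (this is the standard textbook formulation, not a quotation from Shannon), perfect secrecy means that for every message m and every ciphertext c:

Pr[M = m | C = c] = Pr[M = m]

In other words, observing the ciphertext leaves the adversary's probability distribution over the possible messages exactly as it was before.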

To better understand this concept, I invite you to think of security as a scale of levels or grades. The highest level is the strongest and the lowest is the weakest, but in the middle there is a zone of indefinite grade that depends on the technological and computational level of the adversary.

It's not important how many of these degrees are considered secure and how many are not. Essentially, we have to consider what is certainly secure, what is not, and what can be accepted as secure for a given period of time. With that in mind, let's look at the difference between perfect secrecy and secure:

  • A cryptosystem can be considered to have perfect secrecy if it satisfies the following two conditions:
    • It cannot be broken even if the adversary has unlimited computing power.
    • It's not possible to obtain any information about the message, [m], or the key, [k], by analyzing the cryptogram, [c] (for instance, Vernam is a perfect secrecy system in theory, but only under certain conditions).
  • A cryptogram is secure if, even though an adversary could theoretically break the cryptosystem (for example, with quantum computational power and an efficient factorization algorithm), the underlying mathematical problem is considered very hard to solve at that time. Under these conditions, ciphers such as RSA, Diffie-Hellman, and ElGamal can be used because, based on empirical evidence, factorization and discrete logarithms are still hard problems to solve.

So, the concept of security is dynamic and rather fuzzy. What is secure now might not be secure tomorrow. What will happen to RSA and to classical cryptography in general if quantum computers become effective, or if a powerful algorithm able to solve the factorization problem is discovered tomorrow? We will come back to these questions in Chapter 8, Quantum Cryptography. For now, I can say that most classical cryptographic algorithms would be broken by the disruptive computational power of quantum computers, but we don't yet know when this will happen.

Under some conditions, we will see that quantum key exchange can be considered a perfect secrecy system. But it doesn't always work in practice, so it is not widely used yet. Some OTP systems could now be considered highly secure (perhaps semi-perfect secrecy), but everything depends on the practical implementation. Finally, remember an important rule: a weak link in the chain destroys everything.

So, in conclusion, we can note the following:

  • Cryptography has to be open source (the algorithm has to be known); only the key stays secret.
  • The security of an algorithm depends largely on its underlying mathematical problem.
  • Complexity is the enemy of security.
  • Security is a dynamic concept: perfect secrecy is only a theoretical notion.