#### If practical quantum computers become available, all of today's public-key algorithms and the protocols built on them will be vulnerable. The encrypted information of Internet users around the world could be exposed to criminals, competitors, and other adversaries. Texts, emails, credit card payment details, digital signatures, online transactions, sensitive data, and government-classified secrets would all be in jeopardy. Post-quantum cryptography exists to prevent today's private keys from becoming worthless in the near future. For decades, the National Institute of Standards and Technology has set the standards in cryptography, from DES to AES. Researchers from around the world are developing a new set of encryption standards that will run on current classical computers while resisting the quantum machines of the future, and these new standards are expected to be released in the near future. Post-quantum cryptography aims to keep encrypted information safe against these future cybersecurity threats.

#### In 1994, the mathematician Peter Shor developed an algorithm that, given a full-scale quantum computer, can break the security of today's key exchanges and digital signatures. Using this algorithm, a quantum computer could crack modern encrypted information within minutes, whereas all of the classical computers in the world working together would need longer than the age of the Universe. Quantum computers operate differently from traditional computers: they work at the atomic level and sidestep the physical limits that constrain conventional chips. They use qubits instead of bits, and a qubit can represent 0 and 1 at the same time. Once cloud-accessible quantum computers exist, attackers could break software-update signing keys and push fake updates or malware.

### Cryptography

#### The process of hiding or encoding information from adversaries (third parties) or eavesdroppers is called cryptography. Today, classical cryptography provides confidentiality, integrity, and authentication. Confidentiality restricts access to the information. Integrity means verifying that the information has not been altered, either accidentally or maliciously. Authentication verifies the identity of a party. Good cryptography relies on three main things:

Hard Math Problems: The strength of a cryptographic algorithm rests on the hardness of some underlying math problem. If that problem can be solved efficiently, the cryptography built on it can be broken.

Proper Implementation: Cryptographic algorithms must be implemented correctly so that they do not leak information. Even a small implementation flaw can be devastating.

Key Secrecy: The secret piece of information used to unlock the data, called the key, must be kept hidden from adversaries at all costs, which is a difficult task for any organization.

#### Quantum computers target the first of these pillars: hard math problems. The security of the current public-key algorithms relies on the integer factorization problem or the discrete logarithm problem, and both can be solved efficiently on a quantum computer. Integer factorization is the process of decomposing a composite number into its prime factors. When the number is the product of two very large primes, finding those factors is infeasible for a classical computer.

**The RSA algorithm:**
RSA is an asymmetric public-key cryptosystem that is widely used to secure information and data transmission on the Internet. Symmetric encryption involves a single secret key used both to encrypt and to decrypt the information. In asymmetric encryption, two keys are involved: a public key and a private key. Information is encrypted with the public key and decrypted with the private key. RSA can also be used for digital signatures, whose purpose is to authenticate and verify documents and data. Because encryption and decryption use separate keys, RSA requires no prior exchange of a shared secret. The security of RSA rests on the difficulty of factoring the product of two large primes; no efficient algorithm for this problem is known on a classical computer.
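The key generation, encryption, and decryption steps described above can be sketched in a few lines. This toy uses the classic textbook primes 61 and 53; real RSA uses primes of 1024 bits or more, plus padding, so this is for illustration only.

```python
# Toy RSA key generation, encryption, and decryption with tiny primes.
# Real RSA uses much larger primes and message padding; do not use this
# sketch for anything beyond illustration.

p, q = 61, 53                  # two (small) primes, normally kept secret
n = p * q                      # modulus, part of the public key
phi = (p - 1) * (q - 1)        # Euler's totient of n
e = 17                         # public exponent, coprime with phi
d = pow(e, -1, phi)            # private exponent: modular inverse of e mod phi

def encrypt(m: int) -> int:
    """Encrypt with the public key (n, e)."""
    return pow(m, e, n)

def decrypt(c: int) -> int:
    """Decrypt with the private key (n, d)."""
    return pow(c, d, n)

message = 65
ciphertext = encrypt(message)
assert decrypt(ciphertext) == message
```

Note how the modular inverse `d` is the only secret needed to reverse the public operation; recovering `d` from `(n, e)` requires factoring `n`, which is exactly the hard problem the scheme leans on.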

**Shor's Algorithm on a Quantum Computer**
Shor's algorithm can find the prime factors of an extremely large integer in a short amount of time. The best known classical algorithms need super-polynomial time to find such factors, but Shor's algorithm requires only polynomial time on a quantum computer. Its core is period finding, carried out by the Quantum Fourier Transform (QFT), which performs a linear transformation on the qubits: it takes a function and extracts the function's period. Because a quantum computer can hold many states in superposition at once, it can evaluate the periodic function at many points simultaneously.
The QFT is a linear operator, the kind of operator most commonly used in quantum mechanics. Operators are conveniently represented graphically as quantum circuits built from quantum gates, and the QFT decomposes into a small set of basic gates. Quantum entanglement, a central concept in quantum mechanics, describes state inseparability: an entangled system is a quantum state that cannot be decomposed or factored into a product of independent states. The QFT can create and alter entanglement, which contributes directly to the speedup of quantum algorithms. It also makes use of the Hadamard gate, which maps a qubit into an equal superposition of orthogonal basis states.
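The number-theoretic core of Shor's algorithm can be illustrated classically: find the period r of f(x) = a^x mod N, then derive factors of N from it. The brute-force period search below is exponential, which is precisely the step a quantum computer replaces with the QFT; the tiny N = 15 is chosen so the search finishes instantly.

```python
from math import gcd

# Classical sketch of the factoring reduction behind Shor's algorithm.
# A quantum computer finds the period r efficiently via the QFT; the
# brute-force loop below only works for tiny N.

def find_period(a: int, N: int) -> int:
    """Smallest r > 0 with a^r = 1 (mod N), found by brute force."""
    r, value = 1, a % N
    while value != 1:
        value = (value * a) % N
        r += 1
    return r

def shor_factor(N: int, a: int) -> tuple:
    """Recover a nontrivial factorization of N from the period of a."""
    assert gcd(a, N) == 1, "a must be coprime with N"
    r = find_period(a, N)
    assert r % 2 == 0, "need an even period; try another a"
    x = pow(a, r // 2, N)
    return gcd(x - 1, N), gcd(x + 1, N)

print(shor_factor(15, 7))   # period of 7 mod 15 is 4, giving factors (3, 5)
```

The classical postprocessing (computing the two gcds) is cheap; everything hard is concentrated in `find_period`, which is why speeding up period finding breaks RSA.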

**Post-Quantum Cryptography**
Cybersecurity today relies on the hardness of the factoring problem. Because Shor's algorithm can factor efficiently on a quantum computer, cryptographic researchers from all around the world are developing highly secure algorithms for post-quantum cryptography (PQC). PQC has an advantage over Quantum Key Distribution (QKD), another technique for resisting cryptanalytic attacks by a quantum computer: QKD requires expensive, specialized hardware and works only under restrictive conditions. PQC is therefore the best bet for securing information on the Internet against attacks by a quantum computer. Government agencies in countries such as the USA and the UK have already begun preparing the switch to post-quantum cryptography.
The key to transitioning smoothly to the PQC standards lies in crypto-agility. Companies such as Google and IBM have accelerated research in quantum computing and are building experimental quantum computers at a rapid pace. Organizations need firm plans for replacing the current vulnerable public-key algorithms with PQC algorithms as they become available, prioritizing the systems that store or transfer the most sensitive data, updating old operating systems, and limiting exposure to third-party websites. The National Institute of Standards and Technology launched its PQC standardization process in 2016 and announced its first selected algorithms in 2022, with final standards to follow. There are five main approaches to PQC research:

**1. Lattice-Based Cryptography:** This is a quantum-resistant cryptographic technique based on hard lattice problems. A lattice is a regular grid of points in space: mathematically, the set of all integer linear combinations of a collection of linearly independent vectors. These vectors are known as base vectors, and the set of them, called the basis, generates the lattice. Lattice-based cryptography can also support fully homomorphic encryption, where computations are performed directly on encrypted data. The notion of lattice-based cryptography relies on the distinction between a good basis and a bad basis: a basis made of short, nearly orthogonal vectors is good; one made of long, nearly parallel vectors is bad.

#### Hard lattice problems include the Shortest Vector Problem (SVP) and the Closest Vector Problem (CVP). These problems become hard in high dimensions. In the SVP, the goal is to find the shortest non-zero vector in the lattice, and finding very short vectors in high dimensions is still considered exponentially hard. The LLL algorithm reduces a lattice basis in polynomial time, but it cannot find very short vectors once the dimension climbs past roughly 150. Two main encryption schemes are associated with lattice-based cryptography: GGH and NTRU. In the GGH scheme, the good basis is the private key and a bad basis is the public key. The authorized user, holding the good basis, can easily decrypt the message; attackers, knowing only the bad basis, cannot. However, the GGH encryption scheme can be broken, and other lattice-based PQC algorithms are much more secure. The NTRU encryption scheme is based on lattice problems rather than prime factorization and is one of the fastest known public-key cryptosystems. NTRU works in a ring of truncated polynomials, meaning polynomials whose degree is limited to a fixed bound, and messages are represented as polynomials. Many variants of NTRU have been developed, and NTRU-style schemes are strong candidates in the standardization competition run by the U.S. National Institute of Standards and Technology; no concrete attack on the original NTRU approach was known as of February 2023. A drawback of lattice-based algorithms is that their keys are very large, which costs bandwidth and storage rather than security.
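The good-basis/bad-basis intuition behind GGH can be demonstrated with Babai's rounding heuristic for the CVP in two dimensions. The bases and target point below are illustrative values chosen for the example; both bases generate the same lattice (the integer grid), yet only the good one rounds to the correct closest point.

```python
# Babai's rounding heuristic for the Closest Vector Problem in 2D,
# illustrating why a "good" (short, nearly orthogonal) basis recovers
# the nearest lattice point while a "bad" basis does not.

def babai_round(basis, target):
    """Express target in the given basis, round the coordinates,
    and map back to a lattice point."""
    (a, b), (c, d) = basis
    det = a * d - b * c
    # Solve target = c1*(a, b) + c2*(c, d) using the 2x2 matrix inverse.
    c1 = (target[0] * d - target[1] * c) / det
    c2 = (a * target[1] - b * target[0]) / det
    c1, c2 = round(c1), round(c2)
    return (c1 * a + c2 * c, c1 * b + c2 * d)

good = [(1, 0), (0, 1)]      # short, orthogonal vectors
bad = [(5, 1), (6, 1)]       # long, nearly parallel vectors, same lattice
t = (2.4, 3.7)               # target point, e.g. a perturbed ciphertext

print(babai_round(good, t))  # (2, 4): the true closest lattice point
print(babai_round(bad, t))   # (4, 4): the bad basis rounds to a far point
```

In GGH, decryption is exactly this rounding step: the private good basis lands on the intended lattice point, while the public bad basis does not.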

**2. Code-Based Cryptography:** This approach is built on error-correcting codes, which partition bit messages into block codes and correct errors with high precision, allowing efficient encoding and decoding of messages. Code-based cryptography has many encryption schemes; the most popular is the McEliece scheme. The sender expands the plaintext with a code and deliberately fuses in errors (random bit flips), then sends the resulting ciphertext to the receiver. The receiver, who holds the secret structure of the code, corrects the errors and recovers the plaintext.
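The encode / add-errors / correct cycle can be shown with the simplest possible error-correcting code, a 3-fold repetition code. McEliece hides a far stronger code (a Goppa code) inside its public key; this sketch shows only the mechanic of deliberately adding errors that the legitimate receiver can remove.

```python
import random

# Minimal error-correcting-code sketch using a 3-fold repetition code.
# This illustrates the cycle used in McEliece-style encryption, not the
# actual McEliece scheme.

def encode(bits):
    """Repeat each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def add_errors(codeword, n_errors=1):
    """Flip a few random positions, like the sender's deliberate errors."""
    noisy = codeword[:]
    for i in random.sample(range(len(noisy)), n_errors):
        noisy[i] ^= 1
    return noisy

def decode(codeword):
    """Majority vote over each block of three corrects single-bit errors."""
    return [1 if sum(codeword[i:i + 3]) >= 2 else 0
            for i in range(0, len(codeword), 3)]

message = [1, 0, 1, 1]
assert decode(add_errors(encode(message))) == message
```

The security of McEliece comes from the fact that, without the private description of the code, removing the errors is computationally hard, whereas with it the correction is fast.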

**3. Hash-Based Cryptography:** This approach is used in blockchains through stateful hash-based signature schemes such as XMSS. An XMSS system maintains a state that tracks its one-time keys and ensures they are never reused. Since a blockchain already maintains a state, XMSS has been standardized and deployed in the Quantum Resistant Ledger. One-time digital keys sign only one message. The Merkle Signature Scheme combines hash trees with one-time signatures and is secure as long as the underlying hash function is secure. Hash-based post-quantum cryptography is not as efficient as the other methods, but it still does the job of resisting quantum attacks.
The biggest feature of the hash-based approach is that only one good hash function is needed; SHA3-512 is one such example. Hash functions map strings of arbitrary length to fixed-length strings. A one-way hash function offers preimage resistance, a primary property that counters preimage attacks, and good hash functions are also second-preimage resistant: it is computationally infeasible to find a second input that has the same output as a given input. Finding a second preimage remains difficult even for a quantum computer, which is what makes hash-based PQC work.
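The one-time signatures mentioned above can be sketched with a Lamport scheme, the classic building block that Merkle trees and XMSS aggregate. SHA-256 is used here for brevity rather than the SHA3-512 named in the text; the construction is identical, and signing an 8-bit message keeps the example small.

```python
import hashlib
import os

# Sketch of a Lamport one-time signature: security reduces entirely to
# the hash function. Each key pair must sign at most ONE message.

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen(n_bits=8):
    """One random secret per (bit position, bit value); the public key
    holds the hashes of all secrets."""
    sk = [[os.urandom(32), os.urandom(32)] for _ in range(n_bits)]
    pk = [[H(s0), H(s1)] for s0, s1 in sk]
    return sk, pk

def sign(sk, bits):
    """Reveal the secret matching each message bit."""
    return [sk[i][b] for i, b in enumerate(bits)]

def verify(pk, bits, sig):
    """Hash each revealed secret and compare against the public key."""
    return all(H(sig[i]) == pk[i][b] for i, b in enumerate(bits))

sk, pk = keygen()
msg = [1, 0, 1, 1, 0, 0, 1, 0]
sig = sign(sk, msg)
assert verify(pk, msg, sig)
assert not verify(pk, [0] + msg[1:], sig)  # altered message fails
```

Signing reveals half the secrets, which is why reuse is forbidden and why stateful schemes like XMSS must track which one-time keys have already been spent.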

**4. Multivariate Cryptography:** This approach involves polynomial equations in multiple variables and is based on the hardness of solving systems of multivariate equations over finite fields. Visualizing multivariable functions means thinking of spaces with very many dimensions, and this is where the hardness lies. One family of cryptosystems built on this hardness is Hidden Field Equations (HFE), in which the public key consists of multivariate polynomials over finite fields. The Rainbow signature scheme was a prominent multivariate candidate, but a practical key-recovery attack published in 2022 broke its proposed parameters.
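In multivariate schemes the public key is literally a list of quadratic polynomials, and verifying a signature means plugging values in. The field GF(7) and the coefficient arrays below are made-up illustrative values, not taken from any real scheme; they only show what "evaluating the public map" means.

```python
# Evaluating a toy multivariate quadratic (MQ) system over GF(7).
# The system and field here are illustrative, not from a real scheme.

P = 7  # work over the small finite field GF(7)

# Each polynomial is given by coefficients c[i][j] for the monomial x_i*x_j
# (only j >= i is used).
system = [
    [[1, 2, 0], [0, 3, 1], [0, 0, 5]],
    [[2, 0, 4], [0, 1, 0], [0, 0, 6]],
]

def evaluate(poly, x):
    """Evaluate one quadratic form at point x, mod P."""
    n = len(x)
    return sum(poly[i][j] * x[i] * x[j]
               for i in range(n) for j in range(i, n)) % P

def public_map(x):
    """Apply every polynomial in the system: the hard-to-invert map."""
    return [evaluate(poly, x) for poly in system]

print(public_map([1, 2, 3]))
```

Computing the map forward is trivial; the cryptographic hardness lies entirely in inverting it, i.e. finding an `x` that produces a given output vector.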

**5. Isogeny-Based Cryptography:** This approach is built on maps between elliptic curves and is the newest form of post-quantum cryptography. It has the advantage of the smallest key sizes among the post-quantum families, but it is slower than the other algorithms, and its underlying hardness assumptions are the least studied and understood. An elliptic curve is a set of points satisfying a particular mathematical equation, and an isogeny is a structure-preserving map from one elliptic curve to another. Classical elliptic curve cryptography (ECC) is widely used by websites to secure connections with customers, but ECC itself is not quantum resistant; isogeny-based schemes attempt to keep the benefits of elliptic curves while resisting quantum attacks. Notably, the isogeny-based SIKE proposal was broken by a classical attack in 2022, a reminder of how young this field still is.
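The objects these schemes work with can be made concrete by enumerating a tiny elliptic curve over a finite field. The parameters below are a small textbook-style example chosen only so the full point set is easy to list; isogenies are then maps between such point sets that preserve the group structure.

```python
# Enumerate the points of a tiny elliptic curve y^2 = x^3 + a*x + b (mod p).
# The parameters are a small illustrative example, not a real curve choice.

p, a, b = 17, 2, 2

def on_curve(x, y):
    """Check whether (x, y) satisfies the curve equation mod p."""
    return (y * y - (x ** 3 + a * x + b)) % p == 0

points = [(x, y) for x in range(p) for y in range(p) if on_curve(x, y)]
print(len(points))   # number of affine points on this tiny curve
```

Together with the point at infinity, these points form a finite group; isogeny-based cryptography walks between many such curves, and the claimed hardness is in reconstructing the walk.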

#### Today's quantum computers use on the order of 50 to 100 qubits each. If that number exceeds roughly 2,000 qubits on a single machine, it will pose a security threat to encrypted information around the world. The National Institute of Standards and Technology has predicted that the security of the RSA algorithm will be broken within a decade, but PQC is also expected to be deployed in both hardware and software within the next few years. Modern secure-coding requirements can be met during PQC deployment, and various companies have emerged with expertise in secure implementations of the newly established cryptosystems. Testing PQC algorithms on the world wide web will require specific tools that detect every attack immediately, and the cryptographic community agrees that the post-quantum approaches need continuous evaluation.

#### Even though research progress in PQC looks promising, most PQC methods are known to work well only in predictable environments, and post-quantum cryptography may falter when unpredictability is involved. The world must be aware of the drawbacks of post-quantum cryptography, which can grow as quantum computers gain qubits. Performance costs may also become prohibitive: resource-constrained devices such as smartphones and IoT sensors may struggle with the large key sizes of PQC systems. Scalability is a further concern, since it is very difficult to maintain hardness at scale. To eliminate the quantum threat, PQC schemes must remain fast and their underlying problems must remain hard at any scale. If quantum computers do crack factoring, PQC will be indispensable.

**Author - Karan Jain**
