
Key distribution protocols. Key management. Key distribution involving a key distribution center.

Key management

In addition to choosing a cryptographic system suitable for a particular information system (IS), an important issue is key management. No matter how complex and secure the cryptosystem itself is, it relies on the use of keys. While key exchange is trivial when confidential communication involves only two users, in an IS with tens or hundreds of users key management becomes a serious problem.

Key information is understood as the totality of all keys active in the IS. If sufficiently reliable control of key information is not provided, then an attacker who takes possession of it gains unlimited access to all information.

Key management is an information process that includes three elements:

* key generation;

* accumulation of keys;

* distribution of keys.

Let's consider how they should be implemented in order to ensure the security of key information in the IS.

Key generation

At the very beginning of the discussion of cryptographic methods, it was said that keys should not be chosen non-randomly just to make them easy to remember. Serious ISs use special hardware and software methods to generate random keys. As a rule, pseudorandom number generators are used; however, the degree of randomness they provide must be sufficiently high. Ideal generators are devices based on "natural" random processes; for example, production models of key generators based on white radio noise have appeared. Another random mathematical object is the decimal digits of irrational numbers such as π or e, which are computed by standard mathematical methods.

In ISs with medium security requirements, software key generators are quite acceptable; they compute a pseudorandom number as a complex function of the current time and/or a number entered by the user.
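As an illustrative sketch (the helper names are ours, not from the text), the contrast between a medium-security software generator seeded from the current time and a generator drawing on the operating system's entropy pool might look like this in Python:

```python
import random
import secrets
import time

def weak_key(n_bytes=16):
    """Software generator: PRNG seeded from the current time (medium security)."""
    rng = random.Random(time.time_ns())
    return bytes(rng.randrange(256) for _ in range(n_bytes))

def strong_key(n_bytes=16):
    """Key drawn from the OS entropy pool, closer to a 'natural' random source."""
    return secrets.token_bytes(n_bytes)

print(len(weak_key()), len(strong_key()))  # 16 16
```

The weak variant is predictable to anyone who can guess the seeding time, which is exactly why it is acceptable only at medium security levels.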

Key accumulation

The accumulation of keys refers to the organization of their storage, accounting and deletion.

Since the key is the most attractive target for an attacker, opening his way to confidential information, special attention should be paid to the issues of key accumulation.

Secret keys should never be written explicitly on a medium that can be read or copied.

In a fairly complex IS, one user can work with a large amount of key information, and sometimes it even becomes necessary to organize mini-databases of key information. Such databases handle the acceptance, storage, accounting and deletion of the keys used.

All information about the keys used must be stored in encrypted form. Keys that encrypt key information are called master keys. It is desirable that each user know the master keys by heart and not store them on any physical medium at all.

A very important condition for information security is the periodic updating of key information in the IS. In this case, both ordinary keys and master keys should be reassigned. In especially critical ISs, it is desirable to update key information daily.

The issue of updating key information is also related to the third element of key management - key distribution.

Key distribution

Key distribution is the most critical process in key management. Two requirements are imposed on it:
  1. Efficiency and accuracy of distribution.
  2. Secrecy of the distributed keys.
Recently, there has been a noticeable shift towards the use of cryptosystems with a public key, in which the problem of key distribution disappears. Nevertheless, the distribution of key information in IS requires new effective solutions.

Distribution of keys between users is implemented by two different approaches:

  1. By creating one or more key distribution centers. The disadvantage of this approach is that the distribution center knows to whom and which keys are assigned, and this allows it to read all messages circulating in the IS. Possible abuses significantly undermine protection.
  2. Direct key exchange between users of the information system.
In this case, the problem is to reliably authenticate the subjects.

In both cases, the authenticity of the communication session must be guaranteed. This can be provided in two ways:

  1. Request-response mechanism, which works as follows. If user A wants to be sure that the messages he receives from B are not false, he includes an unpredictable element (a request) in the message sent to B. When answering, user B must perform some agreed operation on this element (for example, add 1). This cannot be done in advance, since it is not known what random number will arrive in the request. After receiving a response with the result of this operation, user A can be sure that the session is authentic. The disadvantage of this method is the possibility of establishing a pattern, albeit a complex one, between the request and the response.
  2. Timestamp mechanism. It means recording the time of each message, so that every IS user can tell how "old" an incoming message is.
In both cases, encryption should be used to ensure that the response was not sent by an attacker and the timestamp has not been altered.
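A minimal sketch of the request-response mechanism from point 1, with the transport encryption omitted so only the freshness logic is visible (function names are illustrative):

```python
import secrets

def make_request():
    """A's unpredictable element: cannot be guessed in advance."""
    return secrets.randbelow(2**64)

def respond(nonce):
    """B's agreed operation on the request (here: add 1)."""
    return nonce + 1

nonce = make_request()
answer = respond(nonce)
print(answer == nonce + 1)  # True: A concludes the session is authentic
```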

The use of timestamps raises the problem of the allowable delay interval for session authentication. After all, a message with a timestamp cannot, in principle, be transmitted instantly. In addition, the computer clocks of the recipient and the sender cannot be absolutely synchronized. The question arises: how large a delay of the stamp should be considered suspicious?

Therefore, in real ISs, for example in credit-card payment systems, it is the second mechanism of authentication and forgery protection that is used. The interval used is from one to several minutes. A large number of known methods of stealing electronic money are based on "wedging" into this gap with false withdrawal requests.
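The acceptance window described above can be sketched as a simple freshness check; the two-minute limit here is an assumption based on the "one to several minutes" interval mentioned in the text:

```python
import time

# Allowed delay window for accepting a timestamped message (an assumed value).
MAX_AGE_SECONDS = 120

def is_fresh(message_ts, now=None):
    """Accept a message only if its timestamp is recent enough."""
    now = time.time() if now is None else now
    age = now - message_ts
    return 0 <= age <= MAX_AGE_SECONDS

now = time.time()
print(is_fresh(now - 30, now), is_fresh(now - 600, now))  # True False
```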

Public key cryptosystems can be used for key exchange using the same RSA algorithm.

But the Diffie-Hellman algorithm turned out to be very effective, allowing two users to exchange a key without intermediaries, which can then be used for symmetric encryption.

Diffie-Hellman algorithm

Diffie and Hellman proposed building public-key cryptographic systems on the discrete exponentiation function.

The irreversibility of the transformation in this case is ensured by the fact that it is quite easy to compute the exponential function in a finite Galois field consisting of p elements (p is either a prime or a prime raised to some power). Computing logarithms in such fields is a much more laborious operation.

If y = α^x mod p, where 1 ≤ x ≤ p − 1 and α is a fixed element of the field GF(p), then x = log_α y over GF(p). Given x, it is easy to compute y; this requires at most 2·log2 x multiplication operations.

The inverse problem, computing x from y, is much harder. If p is chosen properly, extracting the logarithm requires a number of computations proportional to

L(p) = exp{(ln p · ln ln p)^0.5}

To exchange information, the first user chooses a random number x1, uniformly distributed over the integers 1 … p − 1. He keeps this number secret and sends the other user the number

y1 = α^x1 mod p,

where α is a fixed element of GF(p) that, together with p, is known to all users in advance.

The second user does the same: he generates x2, computes y2 = α^x2 mod p, and sends it to the first user. As a result, both of them can compute the shared key k12 = α^(x1·x2) mod p.

To compute k12, the first user raises y2 to the power x1 and reduces modulo p; the second user does the same with y1 and x2. Thus both users obtain a common key k12, which can be used to encrypt information with conventional symmetric algorithms. Unlike the RSA algorithm, this algorithm does not itself encrypt the actual information.
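The exchange above can be sketched directly with Python's built-in modular exponentiation. The Mersenne prime and the element α = 5 are illustrative choices only; real systems use much larger, carefully chosen parameters:

```python
import secrets

# Publicly known parameters: a prime p and a fixed field element alpha.
p = 2**127 - 1      # a Mersenne prime; far too small for real use
alpha = 5

# Each user picks a secret exponent uniformly from 1..p-1.
x1 = secrets.randbelow(p - 1) + 1
x2 = secrets.randbelow(p - 1) + 1

# Values exchanged over the open channel.
y1 = pow(alpha, x1, p)
y2 = pow(alpha, x2, p)

# Each side combines its own secret with the other's public value.
k12_user1 = pow(y2, x1, p)   # first user: y2^x1 mod p
k12_user2 = pow(y1, x2, p)   # second user: y1^x2 mod p

print(k12_user1 == k12_user2 == pow(alpha, x1 * x2, p))  # True
```

An eavesdropper sees p, alpha, y1 and y2, but recovering x1 or x2 from them is the discrete logarithm problem.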

Without knowing x1 and x2, an attacker might try to compute k12 knowing only the intercepted y1 and y2. The equivalence of this problem to the problem of computing the discrete logarithm is a major open question in public-key systems; no simple solution has yet been found. For comparison: if the direct transformation of 1000-bit prime numbers requires 2000 operations, the inverse transformation (computing the logarithm in the Galois field) requires about 10^30 operations.

As you can see, for all the simplicity of the Diffie-Hellman algorithm, its second drawback in comparison with the RSA system is the absence of a guaranteed lower bound on the complexity of recovering the key.

In addition, although the described algorithm circumvents the problem of covert key transmission, the need for authentication remains. Without additional means, one of the users cannot be sure that he has exchanged keys with exactly the user he needs. The danger of impersonation remains in this case.

As a generalization of what has been said about the distribution of keys, the following should be said. The task of key management comes down to finding such a key distribution protocol that would provide:

* the possibility of dispensing with a key distribution center;

* Mutual authentication of session participants;

* Confirmation of session authenticity by a request-response mechanism, using software or hardware for this;

* using the minimum number of messages in the key exchange.



Lecture 6: Cryptographic Key Management. Cryptographic Protocols.

Questions:

1. Cryptographic protocols.

2. Distribution of secret keys.

3. Distribution of public keys.

4. Distribution of secret keys using a public key system.

1 Cryptographic protocols.

A cryptographic protocol is a set of formalized rules that describe the sequence of actions performed by two or more parties to solve an information-protection problem using cryptography. That is, a cryptographic protocol incorporates some cryptographic algorithm.

In everyday life, informal protocols are used almost everywhere:

· when playing cards;

· when ordering goods by phone.

These protocols have been developed for a long time and work quite reliably.

Computer protocols are quite another matter. To do what humans do without thinking, computers need formal protocols.

To facilitate the demonstration of the operation of the protocols, several participants are used:

· Alice is the first participant.

· Bob is the second participant.

· Carol is a participant in tripartite protocols.

· Dave is a participant in four-party protocols.

· Eve is a message interceptor.

· Mallory is an active attacker.

· Trent is a trusted intermediary.

· Walter is the warden (guards Alice and Bob).

· Peggy is a prover (trying to prove something).

· Victor is the verifier (checks Peggy).

Distinguish:

· self-contained protocols;

· protocols with an intermediary;

· protocols with an arbiter.

In self-contained protocols the honesty of the parties is guaranteed by the protocol itself. No third party is needed to execute the protocol. The absence of disputes is ensured by the design of the protocol. This is the best type of protocol, but unfortunately these protocols are not suitable for every situation.


Intermediary protocols.

An intermediary is a disinterested third party who is entrusted with completing the execution of the protocol. "Disinterested" means that the intermediary is indifferent both to the result of the protocol and to any of its participants. All participants in the protocol accept the intermediary's words as the truth and recognize all his actions as correct.

In everyday life, an intermediary can be a lawyer, an agency, a bank, etc. With computer networks, the situation is more complicated.


Protocols with an arbiter.


An arbiter is a special type of intermediary: a disinterested and trusted third party. Unlike the intermediary, he does not necessarily participate in the execution of every protocol, but only when disagreements arise between the parties.

Judges are an example.

Arbitrated computer protocols are known. These protocols rely on the assumption that the parties are honest. However, if anyone suspects fraud, a trusted third party can expose it based on the existing data. In addition, a good arbitrated protocol allows the arbiter to identify the attacker. Thus, arbitrated protocols do not prevent fraud but detect it; the inevitability of detection acts as a preventive measure that discourages the attacker.

Organization of communication using symmetric cryptography.

Symmetric cryptosystem model:

1. Alice and Bob choose a cryptosystem.

2. Alice and Bob choose a key.

3. Alice encrypts the plaintext of the message using the encryption algorithm and the key.

4. Alice sends the ciphertext to Bob.

5. Bob decrypts the ciphertext using the key and gets the plaintext.

Eve, positioned between Alice and Bob, can only eavesdrop on the transmission at step 4; she then has to cryptanalyze the ciphertext. This is a passive attack using ciphertext only.

Eve can eavesdrop on steps 1 and 2. In a good cryptosystem, security depends on knowing the key. This is why key management is so important in cryptography.

The active attacker Mallory can go further. At step 4, he can break the communication line, or intercept Alice's message and replace it with his own. Bob has no way to recognize that the message was not sent by Alice.

Alice or Bob can give a copy of the key to Eve, and so on.

Summing up, we list the disadvantages of symmetric cryptosystems:

1. Keys are just as valuable as the messages they encrypt, hence key distribution problem.

2. Anyone who obtains a key can generate false messages.

3. If each pair of network users uses a separate key, the total number of keys grows rapidly with the number of users:

n users: n(n − 1)/2 keys,

10 users: 45 keys,

100 users: 4950 keys, etc.
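The quadratic growth is easy to check with a one-line function:

```python
def pairwise_keys(n):
    """Number of distinct pairwise symmetric keys needed for n users."""
    return n * (n - 1) // 2

print(pairwise_keys(10), pairwise_keys(100))  # 45 4950
```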

Establishing communication using public key cryptography.

1. Alice and Bob agree to use a public key cryptosystem.

2. Bob sends his public key to Alice.

3. Alice encrypts her message using Bob's public key and sends it to Bob.

4. Bob decrypts the message with his private key.

Thus, the problem of key distribution, which is painful for symmetric cryptosystems, is eliminated.


2. Distribution of secret keys.

With traditional encryption, both parties must receive the same key. For security purposes, frequent key changes are required.

So the strength of any symmetric cryptosystem depends greatly on its key distribution system (i.e. the means of delivering keys to the two parties).

For two parties A and B, the distribution of keys can be organized in various ways:

1. The key is chosen by party A and physically delivered to B.

2. The key is chosen by a third party and physically delivered to A and B.

3. One of the parties transmits the new key in encrypted form using the old key.

4. A third party C delivers the key to A and B over secure communication channels, i.e. acts as a Key Distribution Center (KDC).

The key distribution scheme (protocol) can be centralized or distributed (with an intermediary or self-sufficient).

Consider item 4.

The use of the KDC involves organizing a hierarchy of keys (at least two levels). Communication between end users is encrypted using a temporary key called a session key. The session key is obtained from the KDC via the same communication channels that are used for data delivery. Session keys are transmitted in encrypted form, encrypted with a master key shared by the KDC and the given user.

N master keys are required (one per user). They are distributed by non-cryptographic means (physical delivery to the addressee).

Key Distribution Scenario (Central Schema).

Suppose that user A intends to transmit information to user B and a one-time session key is required to protect the data.

User A has a secret key Ka, known only to him and the KDC, and user B has Kb (Ka and Kb are master keys; Ks is a one-time session key).

The exchange of information takes place as follows:

1. User A sends a request to the KDC for a session key to protect the communication with B.

The request sent must include:

- information that makes it possible to uniquely identify A and B (IDA, IDB);

- some identifier N1, unique for each request, called a nonce. The nonce can be a timestamp, a counter, or a random number.

2. The KDC responds to user A's request, encrypting the answer with the key Ka (A's master key). The only user who can read the response is A (hence A is sure that the message came from the KDC).

The response message includes the following elements:

· Intended for A:

- the one-time session key Ks (for the connection between A and B);

- the original request together with the nonce N1, so that user A can match the response to the request.

Thus, A can make sure that his request was not altered on the way to the KDC, and the nonce prevents confusing the answer to this request with answers to previous requests.

· Intended for B:

- the one-time session key Ks;

- user A's identifier IDA (for example, A's network address).

Both elements are encrypted with the key Kb (the master key shared by the KDC and B). They are subsequently forwarded to B in order to establish the connection and identify A.

EKa[Ks || Request || N1 || EKb(Ks, IDA)]

3. User A saves his session key and sends to party B information from the CMC intended for B.

User B receives K s and knows that the received information came from the CRC (because it is encrypted by C B, which is known only to B and CRC).

Thus, A and B have a session key. But before exchanging data, it is desirable to do the following:

4. Using the received session key Ks, user B sends user A a new nonce N2.

5. Using Ks, user A returns f(N2). This is necessary to convince B that the message he originally received was not replayed by an attacker.

Thus, not only key transfer is provided, but also authentication (steps 4 and 5).
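The five steps above can be simulated end to end. The sketch below uses a toy XOR keystream in place of a real cipher (for illustration only, it is not secure), and all names are ours, not from the text:

```python
import hashlib
import json
import secrets

def keystream(key, n):
    """Toy SHA-256-based keystream; NOT a secure cipher."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key, obj):
    data = json.dumps(obj).encode()
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

def decrypt(key, blob):
    data = bytes(a ^ b for a, b in zip(blob, keystream(key, len(blob))))
    return json.loads(data)

# Master keys Ka and Kb, pre-shared with the KDC by non-cryptographic means.
K_a = secrets.token_bytes(16)
K_b = secrets.token_bytes(16)

# Step 1: A -> KDC: request with IDs and nonce N1, sent in the clear.
request = {"from": "A", "to": "B", "N1": secrets.randbelow(2**32)}

# Step 2: KDC -> A: EKa[Ks || Request || N1 || EKb(Ks, IDA)].
K_s = secrets.token_bytes(16).hex()
ticket_for_b = encrypt(K_b, {"Ks": K_s, "ID_A": "A"}).hex()
reply = encrypt(K_a, {"Ks": K_s, "request": request, "ticket": ticket_for_b})

# Step 3: A decrypts, checks the echoed request, forwards the ticket to B.
msg = decrypt(K_a, reply)
assert msg["request"] == request           # request was not altered en route
Ks_at_A = msg["Ks"]
Ks_at_B = decrypt(K_b, bytes.fromhex(msg["ticket"]))["Ks"]

# Steps 4-5: handshake under Ks -- B sends N2, A returns f(N2) = N2 + 1.
N2 = secrets.randbelow(2**32)
challenge = encrypt(bytes.fromhex(Ks_at_B), {"N2": N2})
received_N2 = decrypt(bytes.fromhex(Ks_at_A), challenge)["N2"]
answer = encrypt(bytes.fromhex(Ks_at_A), {"fN2": received_N2 + 1})
assert decrypt(bytes.fromhex(Ks_at_B), answer)["fN2"] == N2 + 1

print(Ks_at_A == Ks_at_B)  # True: both parties hold the same session key
```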


It is not necessary to assign the key distribution function to a single KDC; it is more advantageous to use a hierarchy of KDCs. The more often session keys are changed, the more secure they are, but distributing session keys delays the start of a data exchange session and increases the network load.

The use of a KDC implies that the KDC must inspire trust and be reliably protected from attack. These requirements can be dropped if a decentralized (self-sufficient) key distribution scheme is used.

Decentralized key distribution scheme.

The session key can be determined as a result of the following sequence of actions:

1) A sends B a request for a session key Ks together with a nonce N1.

2) B responds with a message containing the session key Ks and a new nonce N2, encrypted with the master key MKm shared by A and B.

3) A returns f(N2), encrypted with Ks.

3. Distribution of public keys.

One of the main applications of public-key encryption is solving the key distribution problem. There are two quite different uses of public-key encryption in this area:

1. distribution of public keys;

2. using public key encryption to distribute secret keys.

Several methods have been proposed for distributing public keys. In fact, they can be grouped into the following general classes:

1. public announcement;

2. publicly accessible directory;

3. public-key authority;

4. public key certificates.

1) Public announcement of public keys (Uncontrolled distribution) .

Any party involved in the exchange of data can provide its public key to any other party, or broadcast the key to everyone at once: an uncontrolled distribution of public keys.

This approach is convenient, but it has one drawback: such a public announcement can be made by anyone, including an attacker. That is, someone can pose as user A and send a public key to another network user, or offer such a public key for public use. Until user A discovers the fraud and warns other users, the forger will be able to read all encrypted messages intended for A received during this time, and will be able to use the falsified keys for authentication.

2) Publicly accessible directory (Centralized scheme).

A higher degree of security can be provided by using a publicly accessible dynamic directory of public keys. Some trusted center or organization should be responsible for maintaining and distributing this public directory. Such a scheme should include the following elements:

1. An authoritative object that maintains a directory with entries of the form (name, public key) for each member.

2. Each participant registers his public key. Such registration should take place either at the personal appearance of the participant, or through secure communication channels.

3. Any participant can replace the existing key with a new one at any time using the means of authentication. (Perhaps the private key has been compromised in some way, or a lot of information has already been transferred using it.)

4. The entire catalog or updates to it are published periodically.

This scheme is more secure than individual public announcements, but it is also vulnerable. If an adversary succeeds in obtaining the private key of the object authorized to maintain the directory, he will be able to issue falsified public keys and, consequently, speak on behalf of any participant in the data exchange and read messages intended for any participant. The adversary can achieve the same result by altering the entries stored in the directory.

3) Public-key authority.

Better protection can be achieved by stricter control over the distribution of public keys.

A typical scenario is shown below. It assumes the presence of an authoritative center (a public-key authority) that maintains a dynamic directory of the public keys of all participants in the data exchange. In addition, the public key of the center is reliably known to each participant, but only the center knows the corresponding private key. The following actions are performed:

(1) Initiator A sends a message with a date/time stamp (nonce N1) to the authoritative source of public keys with a request for the current public key of participant B.

(2) The authoritative source responds with a message that is encrypted using the authoritative source's private key KR auth . This message can be decrypted by originator A using the public key of the authoritative source. Therefore, sender A can be sure that the message comes from an authoritative source. This message should include the following:

· Participant B's public key , KU b ;

· Original request so that party A can verify that the request has not been modified en route to an authoritative source.


· Original date/time stamp (nonce N1) so that sender A can verify that this is the answer to this particular request.

(3) Initiator A stores party B's public key and uses it to encrypt a message sent to recipient B containing sender A's identifier (ID A) and a nonce N1.

(4) (5) Respondent B receives participant A's public key from an authoritative source in exactly the same way that sender A received recipient B's public key.

By this time, the public keys have been delivered to participants A and B, so that A and B can now start a secure exchange of data. But before that, it is desirable to perform the following two additional steps.

(6) Respondent B sends initiator A a message encrypted with KU A and containing A's nonce (N1) as well as a new nonce generated by participant B (N2). The presence of N1 in message (6) convinces participant A that the sender of the received message was B.

(7) Initiator A returns N2 encrypted with party B's public key so that party B can verify that the sender of the response is A.

So, a total of seven messages is required. However, the first four messages are needed only infrequently, since both parties can store each other's public keys for later use, a practice usually called caching.
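The seven-message flow above can be traced with a purely symbolic encryption model. Here E and D merely tag messages and check that the right half of a key pair is used; all names (KU_auth, ID_A, etc.) are assumptions of the sketch standing in for real public-key operations:

```python
# Symbolic model of the seven-message public-key-authority scenario.
def E(key, msg):                      # "encrypt" under key: just tag the message
    return ("enc", key, msg)

def D(key, packet):                   # "decrypt": succeeds only with the paired key
    tag, used, msg = packet
    assert tag == "enc" and PAIR[used] == key, "wrong key"
    return msg

# Key pairs: public <-> private.
PAIR = {"KU_auth": "KR_auth", "KR_auth": "KU_auth",
        "KU_a": "KR_a", "KR_a": "KU_a",
        "KU_b": "KR_b", "KR_b": "KU_b"}

N1, N2 = 111, 222                     # nonces for this transaction
m1 = ("request KU_b", N1)             # (1) A -> authority, with nonce
m2 = E("KR_auth", ("KU_b", m1))       # (2) authority -> A, "signed" reply
ku_b, echoed = D("KU_auth", m2)       # A verifies origin ...
assert echoed == m1                   # ... and that the request was not modified
m3 = E(ku_b, ("ID_A", N1))            # (3) A -> B
# (4), (5): B obtains KU_a from the authority the same way (omitted here).
m6 = E("KU_a", (N1, N2))              # (6) B -> A
assert D("KR_a", m6)[0] == N1         # A sees its nonce returned: sender was B
m7 = E(ku_b, N2)                      # (7) A -> B
assert D("KR_b", m7) == N2            # B sees its nonce returned: sender was A
```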

4) Public key certificates .

An alternative approach was proposed by Kohnfelder. It is based on certificates.

Each certificate contains a public key and other information; it is generated by an authoritative certificate authority and issued to the participant.

System Requirements :

1. Any party must be able to read the certificate to determine the name and public key of the certificate owner.

2. Any participant should be able to verify that the certificate comes from an authoritative certificate source and is not a fake.

3. Denning added the following requirement: any participant must be able to check the validity period of the certificate.


Fig. Exchange of public key certificates.

Each participant applies to the certificate authority, providing a public key and requesting a certificate for it over a secure communication channel.

The certificate authority issues certificates C A and C B containing: 1) the validity period of the certificate; 2) the identifier of the owner; 3) the public key of the certificate owner. The certificates are encrypted with the private key of the authoritative source.

Participant A can send the certificate to any participant.

The recipient uses the certificate authority's public key KU auth to read the certificate. This guarantees that the certificate came from that authority.

D_KUauth[C_A] = D_KUauth[E_KRauth[T, ID_A, KU_A]] = (T, ID_A, KU_A)
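The issue/verify step can be sketched with a toy RSA "signature" over a hash of the certificate fields. The key numbers below are textbook-sized illustrations, far too small for real use, and the function names are assumptions of the sketch:

```python
import hashlib

# Toy certificate issue/verify: the authority signs a hash of
# (validity period, owner ID, owner public key) with its private key.
N, E_PUB, D_PRIV = 3233, 17, 2753          # toy RSA pair of the authority

def h(payload):
    """Hash the certificate fields down to an integer below the modulus."""
    digest = hashlib.sha256(repr(payload).encode()).digest()
    return int.from_bytes(digest, "big") % N

def issue(validity, owner_id, owner_pub):
    """Authority builds the certificate and signs it with KR_auth."""
    payload = (validity, owner_id, owner_pub)
    return payload, pow(h(payload), D_PRIV, N)

def verify(cert):
    """Any recipient checks the signature with KU_auth."""
    payload, sig = cert
    return pow(sig, E_PUB, N) == h(payload)

cert_a = issue("2025-12-31", "ID_A", 65537)
assert verify(cert_a)                       # certificate really came from the authority
```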

4. Distribution of secret keys using a public key system.

Some users will prefer to use public key encryption only in exceptional cases, because data transfer with such encryption is relatively slow. Therefore, public key encryption should be considered primarily as a means of distributing the secret keys used for traditional encryption.

1) Merkle scheme (self-contained protocol)

If initiator A intends to exchange data with user B, the following procedure is assumed:


1. Party A generates a public/private key pair ( KU A , KR A ) and sends a message to party B containing KU A and sender ID A, ID A .

2. Recipient B generates a secret key K S and transmits it to initiator A encrypted with A's public key.

3. User A calculates D KRa [ E KUa [ K S ]] to recover the secret key. Since only party A can decrypt this message, only parties A and B will know the value K S .

Now both parties, A and B, can use communication protected by traditional encryption with a session key K S . At the end of the data exchange, both A and B throw out K S . Despite its simplicity, this protocol is very attractive.
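A minimal sketch of steps 1-3 with a toy RSA pair. The key numbers are illustrative only and far too small for real use:

```python
import secrets

# Merkle-style exchange with a toy RSA pair (illustrative numbers only).
KU_A = (3233, 17)          # A's public key  (n, e)
KR_A = (3233, 2753)        # A's private key (n, d)

# 1. A sends (KU_A, ID_A) to B in the clear (not modeled further here).
# 2. B picks a session key K_S and returns it encrypted under KU_A.
K_S = secrets.randbelow(3233)
n, e = KU_A
c = pow(K_S, e, n)

# 3. A recovers K_S with its private key; now only A and B know it
#    (assuming, as the text notes, a purely passive adversary).
n, d = KR_A
assert pow(c, d, n) == K_S
```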

Advantage: no keys exist before the communication starts and none remain after it ends. Therefore, the risk of compromise is minimal, while the communication is protected from eavesdropping.

Disadvantage: this protocol is vulnerable to active attacks. If adversary E has the ability to penetrate the communication channel, he can compromise the communication without being detected, as follows.

1. Participant A generates a public/private key pair (KU A, KR A) and sends a message to B containing KU A and participant A's identifier ID A.

2. Adversary E intercepts the message, generates its own public/private key pair ( KU E , KR E ) and sends a message to addressee B containing KU E || ID A .

3. B generates a secret key K S and transmits E KUe [ K S ].

4. Adversary E intercepts this message and learns K S , calculating D KRe [ E KUe [ K S ]].

5. Adversary E sends participant A the message E KUa[K S].

As a result, both participants A and B know K S, but do not suspect that K S is also known to adversary E. Thus, this simple protocol is useful only when the only possible threat is passive interception of messages.
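The attack can be traced in code with two toy RSA pairs (illustrative numbers only): E substitutes its own public key, learns K S, and then relays it under A's genuine key so that neither side notices:

```python
# Man-in-the-middle against the unauthenticated exchange (toy RSA numbers).
KU_A, KR_A = (3233, 17), (3233, 2753)    # A's genuine key pair (n, e)/(n, d)
KU_E, KR_E = (209, 7), (209, 103)        # adversary E's own key pair

# 1-2. A sends KU_A || ID_A; E intercepts and forwards KU_E || ID_A instead.
# 3. B encrypts its session key under what it believes is A's public key.
K_S = 42
c_to_e = pow(K_S, KU_E[1], KU_E[0])

# 4. E decrypts with its private key and learns K_S ...
k_learned = pow(c_to_e, KR_E[1], KR_E[0])
assert k_learned == K_S

# 5. ... then re-encrypts it under A's genuine key and forwards it,
#    so A and B share K_S without suspecting that E knows it too.
c_to_a = pow(k_learned, KU_A[1], KU_A[0])
assert pow(c_to_a, KR_A[1], KR_A[0]) == K_S
```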

2) Distribution of secret keys with confidentiality and authentication.

The scheme provides protection against both active and passive forms of attack. As a premise, assume that A and B have already exchanged public keys using one of the schemes described above.


(1) Party A uses party B's public key to send B an encrypted message containing participant A's identifier (ID A) and a nonce (N1) used to identify this particular transaction.

(2) User B decrypts (1) using KR B and sends user A a message encrypted with KU A containing the received nonce (N1) and a new nonce (N2). Since only party B could decrypt message (1), the presence of N1 in message (2) convinces participant A that the respondent is B.

(3) Party A returns N2, encrypted with party B's public key, to assure B that its correspondent is A.

(4) Participant A chooses a secret key K S and sends a message to participant B M = E KUb [E KRa [K S ]]. Encrypting this message with B's public key ensures that only B can read it, and encrypting it with A's private key ensures that only A can send it.

(5) Party B computes D KUa[D KRb[M]] to recover the secret key.

When exchanging secret keys, this scheme guarantees both confidentiality and authentication.
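A sketch of steps (1)-(5) with toy RSA pairs. Note that A's modulus is chosen smaller than B's so that the value signed with KR A fits under B's modulus for re-encryption; all numbers are illustrative only:

```python
# Confidential and authenticated secret-key distribution (toy RSA pairs).
n_a, e_a, d_a = 3233, 17, 2753       # A's key pair
n_b, e_b, d_b = 3599, 7, 2983        # B's key pair (larger modulus than A's)
N1, N2 = 7, 13                       # nonces for this transaction

# (1)-(3): the nonce handshake, abbreviated to the checks it provides.
msg1 = pow(N1, e_b, n_b)             # A -> B: only B can read N1
assert pow(msg1, d_b, n_b) == N1
msg2 = pow(N2, e_a, n_a)             # B -> A: N2 (N1 is echoed alongside)
assert pow(msg2, d_a, n_a) == N2

# (4): A signs K_S with its private key, then encrypts the result for B,
#      so only B can read it and only A could have sent it.
K_S = 99
m = pow(pow(K_S, d_a, n_a), e_b, n_b)

# (5): B strips its layer with KR_B, then recovers K_S with A's public key.
assert pow(pow(m, d_b, n_b), e_a, n_a) == K_S
```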

3) Hybrid scheme (three-level).

Represents a hybrid approach used on IBM mainframes. This brokered scheme involves the participation of a key distribution center (KDC), with which each user shares its own master secret key; secret session keys are distributed encrypted with the master key. A public key encryption scheme is used to distribute the master keys. This three-level approach is based on the following logic:

· The speed of the procedure .

There are many applications where session keys need to change very frequently. Distribution of session keys using a public key scheme could make the system performance too slow due to the relatively high computational resource requirements for encryption and decryption using such a scheme. In the case of a three-level hierarchy, public-key encryption is used only occasionally to change the master key.

· Backward compatibility.

The hybrid scheme can be easily implemented as an extension of an already existing KDC-based scheme, with minimal changes to the established procedures and software.

The addition of a layer of public key encryption provides a secure and efficient means of distributing master keys. This is an advantage in a configuration where one KDC serves a large number of users located at a considerable distance from each other.

5. Diffie-Hellman key exchange.

The first published algorithm based on public keys appeared in the work of Diffie and Hellman, in which the very concept of public-key cryptography was defined. Usually this algorithm is called Diffie-Hellman key exchange. This key exchange technology is implemented in a number of commercial products.

The purpose of the scheme is to provide two users with a secure way to communicate a key to each other so that they can use it to encrypt subsequent messages.

The cryptographic strength of the Diffie-Hellman algorithm relies on the difficulty of computing discrete logarithms. Formally, the discrete logarithm can be defined as follows. First, a primitive root of a prime number p is defined: a number a whose powers modulo p generate all integers from 1 to p-1. This means that if a is a primitive root of the prime number p, then the numbers

a mod p, a^2 mod p, ..., a^(p-1) mod p

must be different and represent all integers from 1 to p -1 in some permutation.

The Diffie-Hellman key exchange is illustrated in the figure. In this scheme there are two numbers known to all: a prime number q and an integer a that is a primitive root of q. Suppose users A and B intend to exchange keys.

User A chooses a random integer X A < q and computes Y A = a^XA mod q. Similarly, user B independently chooses a random integer X B < q and computes Y B = a^XB mod q. Each side keeps its value of X secret and makes its value of Y available to the other side. User A computes the key as K = (Y B)^XA mod q, and user B as K = (Y A)^XB mod q. These two formulas give the same result.
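The computation above can be sketched directly; q = 353 and a = 3 are small illustrative parameters, far too small for real use:

```python
import secrets

# Toy Diffie-Hellman exchange (illustrative parameters only).
q, a = 353, 3                        # public prime and a primitive root of it

x_a = secrets.randbelow(q - 2) + 1   # A's secret exponent, 1 <= X_A < q
x_b = secrets.randbelow(q - 2) + 1   # B's secret exponent
y_a = pow(a, x_a, q)                 # Y_A, sent openly to B
y_b = pow(a, x_b, q)                 # Y_B, sent openly to A

k_ab = pow(y_b, x_a, q)              # A computes K = (Y_B)^X_A mod q
k_ba = pow(y_a, x_b, q)              # B computes K = (Y_A)^X_B mod q
assert k_ab == k_ba                  # both equal a^(X_A * X_B) mod q
```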

So both sides exchanged secret keys. And since at the same time X A and X B were only in personal use and therefore kept secret, the enemy will have to work only with q , a , X A, X B . Thus, he will have to calculate the discrete logarithm to determine the key. For example, to define a key.

After that, he will be able to calculate the key K in the same way as user B does.

The security of Diffie-Hellman key exchange relies in fact on the fact that while powers modulo some prime number are relatively easy to compute, discrete logarithms are very difficult to compute. For large prime numbers, the latter is considered a problem that is practically unsolvable.


The enemy knows q, a, Y A, Y B. To determine the key, you need to calculate the discrete logarithm.


Key accumulation. The accumulation of keys means the organization of their storage, accounting and deletion.

Since the key is the most attractive object for an attacker, opening the way for him to confidential information, the issues of key accumulation should be given special attention.

Secret keys should never be written explicitly on a medium that can be read or copied.

In a fairly complex system, one user can work with a large amount of key information, and sometimes it even becomes necessary to organize minidatabases of key information. Such databases are responsible for accepting, storing, recording and deleting the keys used.

All information about the keys used must be stored in encrypted form. Keys that encrypt key information are called master keys. It is desirable that each user knows the master keys by heart and does not store them on any material media at all.

A very important condition for information security is the periodic updating of key information in the system. In this case, both ordinary keys and master keys should be reassigned. In especially critical systems, key information must be updated daily.


The issue of updating key information is also related to the third element of key management - key distribution.

Key distribution. Key distribution is the most critical process in key management. Two requirements are imposed on it:

Efficiency and accuracy of distribution;

Secrecy of distributed keys.

Recently, there has been a noticeable shift towards the use of public key cryptosystems, in which the problem of key distribution disappears. Nevertheless, the distribution of key information in the system requires new efficient solutions.

The distribution of keys between users is implemented by two different approaches:

1. By creating one or more key distribution centers. The disadvantage of this approach is that the distribution center knows to whom which keys are assigned, which makes it possible to read all messages circulating in the system. Possible abuses significantly affect security.

2. Direct key exchange between system users. In this case, the problem is reliable authentication of the parties.

In both cases, the authenticity of the communication session must be guaranteed. This can be provided in two ways:

1. The request-response mechanism, which works as follows. If user A wants to be sure that the messages he receives from user B are not false, he includes an unpredictable element (a request) in the message sent to B. When answering, user B must perform some operation on this element (for example, add 1). This cannot be done in advance, since B does not know what random number will arrive in the request. After receiving a response with the result of this operation, user A can be sure that the session is authentic. The disadvantage of this method is the possibility of establishing a pattern, albeit a complex one, between the request and the response.

2. The time stamp mechanism. It implies recording the time of each message. In this case, each user of the system can tell how "old" an incoming message is.

In both cases, encryption should be used to ensure that the response was not sent by an attacker and that the timestamp has not been altered.

When using timestamps, there is a problem with the allowable delay time interval for session authentication. After all, a message with a timestamp, in principle, cannot be transmitted instantly. In addition, the computer clocks of the recipient and sender cannot be perfectly synchronized.
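The request-response mechanism can be sketched as follows. Here the "operation on the element" is taken to be an HMAC under a pre-shared key rather than the "add 1" example from the text; this is one possible choice, and the names are assumptions of the sketch:

```python
import hashlib
import hmac
import secrets

# Request-response sketch: B proves liveness by keying an HMAC over
# the unpredictable element that A sent.
shared = b"pre-shared secret key"        # assumed to be known to A and B only

def respond(challenge: bytes) -> bytes:
    """B's operation on the request element."""
    return hmac.new(shared, challenge, hashlib.sha256).digest()

# A sends an unpredictable nonce; B answers; A verifies the answer.
nonce = secrets.token_bytes(16)
answer = respond(nonce)
expected = hmac.new(shared, nonce, hashlib.sha256).digest()
assert hmac.compare_digest(answer, expected)   # session is authentic
```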

Public key cryptosystems, for example the same RSA algorithm, can also be used for key exchange.

But the Diffie-Hellman algorithm turned out to be very effective, allowing two users to exchange a key without intermediaries, which can then be used for symmetric encryption.

Diffie-Hellman algorithm. Diffie and Hellman proposed a discrete exponentiation function for creating public-key cryptographic systems.

The irreversibility of the transformation in this case is ensured by the fact that it is quite easy to calculate the exponential function in a finite Galois field consisting of p elements ( p is either prime or prime to any power). Calculating logarithms in such fields is a much more time-consuming operation.

To exchange information, the first user chooses a random number x 1, equiprobably from the integers 1 to p-1. He keeps this number secret and sends the number y 1 = α^x1 mod p to the other user, where α is a fixed element of the Galois field GF(p) that, together with p, is distributed to the users in advance.

The second user does the same, generating x 2 and computing y 2 = α^x2 mod p, which he sends to the first user. As a result, both of them can compute the shared secret key k 12 = α^(x1 x2) mod p.

In order to calculate k 12 , the first user raises y 2 to the power x 1 and finds the remainder after dividing by p. The second user does the same, only using y 1 and x 2. Thus, both users have a common key k 12 , which can be used to encrypt information with conventional algorithms. Unlike the RSA algorithm, this algorithm does not encrypt the actual information.
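A worked check of this computation in the same notation, with tiny numbers p = 23 and α = 5 chosen only for readability:

```python
# Worked Diffie-Hellman check in the text's notation (tiny numbers).
p, alpha = 23, 5
x1, x2 = 6, 15                  # the users' secret exponents

y1 = pow(alpha, x1, p)          # sent by the first user: 5^6 mod 23 = 8
y2 = pow(alpha, x2, p)          # sent by the second user: 5^15 mod 23 = 19

k12_first = pow(y2, x1, p)      # first user: y2^x1 mod p
k12_second = pow(y1, x2, p)     # second user: y1^x2 mod p

assert y1 == 8 and y2 == 19
assert k12_first == k12_second == 2   # shared key k12 = 5^(6*15) mod 23
```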

Without knowing x 1 and x 2, an attacker might try to compute k 12 knowing only the intercepted y 1 and y 2. The equivalence of this problem to the problem of computing the discrete logarithm is a major open question in public key systems. A simple solution has not been found so far: if the direct transformation of 1000-bit prime numbers requires 2000 operations, the inverse transformation (computing the logarithm in the Galois field) requires about 10^30 operations.

As can be seen, despite the simplicity of the Diffie-Hellman algorithm, its disadvantage compared to the RSA system is the absence of a guaranteed lower bound on the complexity of key disclosure.

In addition, although the described algorithm circumvents the problem of covert key transmission, the need for authentication remains. Without additional means, one of the users cannot be sure that he has exchanged keys exactly with the user he needs.