Why was the term “discrete” used in discrete logarithm?
Is there anything especially "discrete" about a discrete logarithm? This is not a question of what a discrete logarithm is, or of why the discrete logarithm problem is "intractable" under certain circumstances. I'm just trying to determine whether there's some additional meaning to the term "discrete" as it's used in the name "discrete logarithm".
The definition of "discrete" is "individually separate and distinct". Could it be that the term "discrete" is a reference to the least non-negative residues of a modulus, or to the order of points of a particular cyclic group on an elliptic curve?
Tags: discrete-logarithm, terminology
asked Apr 15 at 20:09
JohnGalt
Traditional logarithm: answer is a real or complex number. Discrete logarithm: answer is an element of a finite set $\mathbb{Z}_n$.
– Mikero
Apr 15 at 20:18
See also discrete mathematics
– BlueRaja - Danny Pflughoeft
Apr 15 at 22:48
That's pretty discreet information.
– yo'
2 days ago
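Mikero's contrast can be made concrete in a couple of lines of Python (a quick editorial sketch; the numbers are toy values, not taken from the discussion):

```python
# Ordinary logarithm: the answer is a real number. Discrete logarithm: the
# answer is an integer in Z_n. Toy values: base 5, target 9, modulus 23.
import math

print(math.log(9, 5))    # 1.3652... (a real number)
print(pow(5, 10, 23))    # 9, so the discrete log of 9 to base 5 mod 23 is the integer 10
```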
3 Answers
The word discrete is used as an antonym of 'continuous': it is the ordinary logarithm problem, just posed over a discrete group.
The standard logarithm problem is over the infinite group $\mathbb{R}^*$; this group is called 'continuous' because, for any element $x$, there are other elements arbitrarily close to it.
The discrete logarithm problem is over a finite group (for example, $\mathbb{Z}_p^*$); in contrast to $\mathbb{R}^*$, there are no group elements arbitrarily close together, and we call this type of group 'discrete'.
answered Apr 15 at 20:18 by poncho
Yes, being discrete is not the "core reason" why the DLP can be hard; although note that if we are ever to use the crypto we build on a computer, things had better be discrete: at best, we can only approximate a continuous primitive in a discrete way.
– Geoffroy Couteau
Apr 15 at 20:51
@JohnGalt The hard part of the DLP is the modular reduction "hiding" how many times you've "wrapped around". Without modular reduction, you can do a sort of binary search to get accurate lower/upper bounds on the discrete log rather efficiently.
– Mark
Apr 15 at 20:51
@GeoffroyCouteau This isn't precisely true. Any computable number has a finite-length Turing machine that, on input $n$, will output the $n$th digit of the number. This can be viewed as a finite-length representation of the number. As an example, there are formulas for $\pi$ (such as the BBP formula) that compute the $i$th digit without computing all smaller digits. This is definitely a different notion of "representation" than simply storing the literal value in memory, but the computable numbers are notably not discrete (they contain $\mathbb{Q}$ as a subfield).
– Mark
Apr 15 at 21:04
@Mark You are perfectly right, although for the purpose of getting a high-level intuition for this specific question (about the discrete log), I felt my answer would provide one. We could perhaps work with problems over more general structures through appropriate representations, though I believe this has never been done in cryptography.
– Geoffroy Couteau
Apr 15 at 21:08
@JohnGalt: To add a bit to Mark's and Geoffroy Couteau's comments: in principle, if we had a computer that could store arbitrary real numbers and perform infinite-precision math on them, then we could do the equivalent of modular reduction by (for example) just dropping the integer part and keeping the fractional part. (If all you knew was that n is an integer in [0, N) and the fractional part of eⁿ was 0.1254123452312..., it would be very hard to find n.) So you can see that the discreteness really isn't what makes the discrete logarithm hard to compute.
– ruakh
Apr 16 at 0:01
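The contrast drawn in poncho's answer and the comments above can be sketched in a few lines of Python (an editorial illustration with toy values, not part of the original answer): when there is no modular reduction, $a^y$ is monotone in $y$, so a binary search on the exponent recovers it quickly; once you reduce mod $p$, that ordering is gone and (absent cleverer algorithms) you are left stepping through exponents.

```python
def log_no_reduction(a, x):
    """Find y with a**y == x by binary search, using that a**y is increasing for a > 1."""
    lo, hi = 0, 1
    while a ** hi < x:          # grow an upper bound on the exponent
        hi *= 2
    while lo < hi:              # then bisect on the exponent
        mid = (lo + hi) // 2
        if a ** mid < x:
            lo = mid + 1
        else:
            hi = mid
    return lo

def dlog_bruteforce(a, x, p):
    """Find y with pow(a, y, p) == x by trying every exponent (feasible only for tiny p)."""
    t = 1
    for y in range(p - 1):
        if t == x:
            return y
        t = (t * a) % p
    return None

print(log_no_reduction(3, 3 ** 1234))   # 1234, found in a few dozen comparisons
print(dlog_bruteforce(3, 15, 17))       # 6, found only by stepping through exponents
```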
While I agree completely with poncho's answer, this other viewpoint might be useful.
Specifically, I think a better comparison isn't between $\mathbb{Z}_p^*$ and $\mathbb{R}^*$, but between $\mathbb{Z}_p^*$ and $S^1$. We can view $S^1 \cong \{z \in \mathbb{C} \mid |z| = 1\}$. It's not hard to show that any $z \in S^1$ can be written as $z = \exp(2\pi i t)$ for $t \in \mathbb{R}$ (we don't strictly need the factor $2\pi$ here, but it's traditional). Because $\exp(x)$ is periodic, it's in fact enough to have $t \in [0,1)$.
This has an obvious group structure, in that:
$$\exp(2\pi i t_0)\exp(2\pi i t_1) = \exp(2\pi i (t_0+t_1))$$
If we restrict to $t_i \in [0,1)$, then we have to take $t_0+t_1 \bmod 1$, but this is fairly standard.
More than just having an obvious group structure, any $\mathbb{Z}_p^*$ injects into it.
Specifically, we always have:
$$\phi_p : \mathbb{Z}_p^* \to S^1, \quad \phi_p(x) = \exp(2\pi i x/(p-1))$$
Here, $p-1$ appears in the denominator because $|\mathbb{Z}_p^*| = p-1$.
We can define the discrete logarithm problem for both of these groups in the standard way (here, it's important to restrict $t \in [0, 1)$ if we want a unique answer).
Then we can relate these problems to each other via the aforementioned injection.
Through this picture, we see that $S^1$ is "continuous" in the sense that it takes up the full circle, but the image of $\mathbb{Z}_p^*$ in $S^1$ will always be "discrete": there will always be some space between points (they can't get arbitrarily close).
answered Apr 15 at 20:49 by Mark
A nice feature of this example is that because $S^1$ readily supports an operator that raises values to a fractional power, one can define log functions in terms of that. Because for most pairs $(x, z)$ there are generally many exponents $y$ such that $x$ to the power $y$ yields $z$, there are many possible log functions, which use different ways to select which power $y$ gets used. A continuous-log function might select a $y$ in $S^1$, while a discrete-log function would select the smallest $y$ in $\mathbb{Z}$.
– supercat
Apr 16 at 15:32
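The discreteness of the image of $\mathbb{Z}_p^*$ in $S^1$ described above can also be checked numerically (an editorial sketch; $p = 101$ is an arbitrary small prime chosen only for illustration): every image point lies on the unit circle, yet distinct image points never get closer than a fixed chord length.

```python
import cmath
import math

p = 101
# the image of Z_p^* under phi_p(x) = exp(2*pi*i*x/(p-1))
points = [cmath.exp(2j * math.pi * x / (p - 1)) for x in range(1, p)]

# every image point lies on the unit circle (up to floating-point error)
assert all(abs(abs(z) - 1.0) < 1e-12 for z in points)

# the smallest distance between distinct image points is bounded away from zero
gaps = [abs(points[i] - points[j])
        for i in range(len(points)) for j in range(i + 1, len(points))]
print(min(gaps))                        # ~0.0628
print(2 * math.sin(math.pi / (p - 1)))  # the same value: the minimal chord length
```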
Just to add to the other answers: as mentioned in some of the comments, it is exactly the discreteness of the discrete log problem that makes it (for some parameter choices) hard. Computing $y = \log_a(x)$ is the same as solving the equation $a^y = x$ for $y$. In the non-discrete case, $y \mapsto a^y$ is a monotonically increasing (if $a > 1$) continuous function. Thus, you can (in the absence of even more efficient methods) use the bisection method to solve for $y$. When you have a value $y$ for which $a^y$ is close to the target $x$, you know that $y$ is close to the value you seek. Knowing when you are close to a solution is very useful information.
In the discrete case, there is no corresponding notion of closeness. Say that for some reason you wanted to compute the base-$19$ discrete log of $7155 \pmod{34591}$ and somehow found that $19^{481} \equiv 7156 \pmod{34591}$. Does this imply that $\log_{19}(7155)$ is close to $481$? Not at all. The actual value is $\log_{19}(7155) = 28544$. It is much harder to find a solution when you can't tell when you are close.
answered 2 days ago by John Coleman (edited 2 days ago)
Is "Knowing when you are close to a solution is very useful information." related to knowing when the value of something is greater than the value of another? That is, in a binary search algorithm used to calculate a log from an exponentiated power, it is critical to know the upper and lower limits of the target power on each test. Each iteration gets you closer and closer to the solution until the solution is reached. If you can't determine whether the test is greater than (or less than) the target, the search doesn't work.
– JohnGalt
2 days ago
@JohnGalt They are closely related (because the standard topology on $\mathbb{R}$ is the order topology), but they are not the same thing. Order definitely plays a role here, but continuity can be used to solve equations even when there isn't a clear order ($\mathbb{C}$ does not have an order topology). Certainly the bisection method for finding a real root of a continuous function has the same basic logic as a binary search.
– John Coleman
2 days ago
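The numbers quoted in John Coleman's answer can be checked directly with Python's three-argument `pow` (a verification sketch; the expected outputs are simply the values the answer quotes):

```python
p, a = 34591, 19

print(pow(a, 481, p))     # per the answer: 7156, only 1 away from the target 7155
print(pow(a, 28544, p))   # per the answer: 7155, i.e. log_19(7155) = 28544

# residues for exponents near 481 are related by factors of 19 mod p, so they
# jump around rather than creeping toward 7155 in small steps
for y in range(478, 485):
    print(y, pow(a, y, p))
```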