Talk:Blum Blum Shub
This article is rated Start-class on Wikipedia's content assessment scale. It is of interest to the following WikiProjects: Cryptography (High-importance), Computer science (High-importance), Computing (Low-importance), Software (Low-importance), and Computer Security (Mid-importance).
This article needs to be rewritten so that a non-technical user can understand it.
At the moment, it appears as so much gobbledygook to an average reader like myself. It uses obscure terms and references concepts that few people will readily understand.
I'd make an attempt at it myself but there's so little to go on that I would do more harm than good.
Perhaps someone who understands the subject would be prepared to present the concepts in plain English?
LuciusAeliusSejanus (talk) 14:03, 24 February 2016 (UTC)
Good example of particular parameters?
Tried to get a feeling for what BBS is producing, and how fast it would be on a 32-bit CPU.
I tried these params: M = 0xffffffe9 = 4294967273 = 10847 * 395959; p = 10847, phi(p-1) = 4480 = 2^7 * 5 * 7; q = 395959, phi(q-1) = 131984 = 2^4 * 73 * 113; gcd(phi(p-1),phi(q-1)) = 2^4 - not "big", right?
The seed, of course, shouldn't be 0 or 1.
Yet, starting with seed 3, I get
9 81 6561 43046721 3793632897 2038349057 2324068865 120648705 1995565057 2418266113 2840043521 1989099521 2099150849 977076225 1954152449 3908304897 3521642497 2748317697 1201668097 2403336193 511705089 1023410177 2046820353 4093640705 3892314113 3489660929 2684354561 1073741825 2147483649 1 1 ...
Seeds 5 and 7 perform similarly... Seems like there are some additional requirements on the parameters and/or seeds? In case I made a silly mistake, the code is:
-- most seeds have exactly the same cycle length; some seeds don't, but that's uncommon (what are the bad seeds??)
#include <stdio.h>

#define NEXT(x) ((x)*(x)) % 0xffffffe9

int main() {
    unsigned i = 7;
    while (1) {
        i = NEXT(i);
        printf("%u\n", i);
    }
    return 0;
}
Googled some BBS related info: http://www.ciphersbyritter.com/NEWS2/TESTSBBS.HTM
- Your M is too large for 32-bit ints. (x)*(x) will cause an integer overflow when x > 0xffff. Your code seems to work fine when using 64-bit ints. Ufretin 07:54, 10 January 2007 (UTC)
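For anyone who wants to try it, here is a minimal sketch of the same loop with the squaring widened to 64 bits, adapted from the code above (my adaptation, not Ufretin's; only the multiply changes):

#include <stdio.h>
#include <stdint.h>

/* same recurrence as above, but the square is computed in 64 bits,
   so x*x no longer overflows for any 32-bit x */
#define NEXT(x) (uint32_t)(((uint64_t)(x) * (x)) % 0xffffffe9u)

int main(void) {
    uint32_t i = 7;
    while (1) {
        i = NEXT(i);
        printf("%u\n", i);
    }
    return 0;
}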
Reduction to the Factoring Problem
It is a common error to think that Blum Blum Shub reduces to the factoring problem. It reduces to the Quadratic residuosity problem. It's just that no way is known to solve it without factoring the modulus. Garfield Garfield (talk)
Parity
To anonymous editor on 64.151.29.184: I more or less reverted your change, as I found your explanation of the LSB (least significant bit) not very clear. Also, I don't know what you mean with the expression "x+1n+1". -- Jitse Niesen 12:06, 4 Feb 2004 (UTC)
- Your cleanup unfortunately muddied a few things while clarifying others; the point of the original was that output could be derived from the parity or from least-significant bits, not that they were the same thing. I've done my best to clear that up. Cheers. Andrew Rodland 05:31, 28 August 2005 (UTC)
Sorry. Just to clarify, with "bit parity", you mean the number of ones in the binary representation? -- Jitse Niesen (talk) 14:07, 28 August 2005 (UTC)
- Yes, that's what I was after. Now, I realize that the original may have simply intended a meaning of "parity" indicating the least-significant bit. You may wish to remove "either the bit parity of xn or". It's possible that parity in the sense of parity bit is used for the output of BBS, but I have no reason to believe that it is other than my (probably mistaken) reading of a previous revision. Andrew Rodland
Actually, [1] says "the output is the least significant bit of x_n or the parity of x_n", so your initial reading was probably right after all. -- Jitse Niesen (talk) 12:57, 31 October 2005 (UTC)
I'm wondering if this is a misinterpretation, as the article Comparison of two pseudo-random number generators, authored by Blum, Blum and Shub, and linked to by this article, defines the output as Parity(x_i), then later defines Parity(x) as the least significant bit of x (on page 73).--207.81.129.224 21:30, 15 November 2007 (UTC)
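For reference, here is a small C sketch (mine, purely for illustration) that prints both conventions side by side, using the p = 11, q = 19, seed 3 parameters that come up later on this page; the two bit streams do differ:

/* compare "least significant bit" with "bit parity" (number of 1 bits mod 2) */
#include <stdio.h>

static int parity(unsigned x) {
    int b = 0;
    while (x) { b ^= (x & 1); x >>= 1; }
    return b;
}

int main(void) {
    const unsigned M = 11 * 19;   /* = 209 */
    unsigned x = 3;               /* seed */
    for (int i = 1; i <= 6; i++) {
        x = (x * x) % M;
        printf("x%d = %3u  lsb = %u  parity = %d\n", i, x, x & 1, parity(x));
    }
    return 0;
}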
Symmetric or Asymmetric?
I've been looking through the references on this page trying to find an answer to that question. Nowhere does it explicitly say whether the algorithm is symmetric or asymmetric (I had a hard time finding decryption algorithm resources and definitions, only a mention of the Chinese remainder algorithm). I am guessing that it is symmetric, but from what I read I cannot say for sure. Any help would be appreciated. --Stux 15:34, 24 October 2005 (UTC)
- I am not sure what you mean here by "symmetric". As far as I can see from the article, the Blum Blum Shub algorithm only generates (pseudo)random numbers. It is not an algorithm for encrypting and decrypting text. Apologies if this is obvious to you, in which case I probably can't help. -- Jitse Niesen (talk) 16:29, 24 October 2005 (UTC)
- No apologies needed, you raise a good point (and thank you for the quick reply!) -- all of the primary descriptions of the algorithm list it as a PRNG, and it is not entirely clear how it would be used for encryption. However, both the article itself and the integer factorization page list it as a cryptographic tool. By "symmetric" I mean whether it is a tool for private-key encryption (using a symmetric key) or public-key encryption (using an asymmetric pair/set of keys). --Stux 17:56, 24 October 2005 (UTC)
- Many cryptographic algorithms need a good PRNG, and it is used in both symmetric and asymmetric encryption. In asymmetric algorithms it is used to generate big prime numbers; in symmetric algorithms it is used to generate random keys (for example, GPG generates a random key, encrypts the plaintext using a symmetric algorithm such as AES, and then sends the random key encrypted with RSA). From this point of view, BBS is just a "tool" that can be used by any algorithm that needs random bits. In the real world BBS is rarely used because it is too slow, but there is an algorithm, Blum-Goldwasser probabilistic encryption (which is asymmetric), that relies explicitly on BBS to generate keys. Sorry for the poor grammar and spelling :-P. If you need more details you can contact me at my page on Wikipedia Italia. --Ippatsu (talk)
- I would like to add that, although this is not a cipher algorithm, it's based on the original Rabin cryptosystem. By the way, one statement in the article may be wrong (though we can't decide that yet): cracking RSA is not proven to be as difficult as factoring the modulus, so you currently cannot say that RSA is tied to the factoring problem, and as such the BBS generator might be even more secure than RSA. However, its security is equivalent to that of the Rabin cryptosystem. -- Ertugrul Soeylemez
- On using the BBS generator to encrypt: it's possible to use it as part of a public-key encryption scheme. The private key is the pair of primes (p, q), both congruent to 3 (mod 4); the public key is their product, the Blum integer N = pq. To encrypt an L-bit message m_1...m_L, one chooses a random seed x_0 and sets x_i = (x_(i-1))^2 mod N and c_i = m_i XOR b_i for i = 1, ..., L, where b_i is the least significant bit of x_i; one then transmits the ciphertext (c_1, ..., c_L, x_(L+1)), where x_(L+1) = (x_L)^2 mod N. To decrypt, one takes x_(L+1) and computes x_1 by means of the Chinese Remainder Theorem. In particular, let d_p = ((p+1)/4)^L mod (p-1) and d_q = ((q+1)/4)^L mod (q-1); then x_1 = (x_(L+1))^(d_p) mod p and x_1 = (x_(L+1))^(d_q) mod q, and the CRT combines these residues modulo p and q into x_1 modulo N. So the recipient can recover x_1, and therefore the other x_i and b_i, and hence decrypt the message. Conversely, the bits b_i are hard-core bits of the squaring map modulo N, which is one-way if factoring is hard. Hence the b_i are computationally indistinguishable from independent uniformly random bits. This proves the semantic security of the construction. It was introduced in M. Blum and S. Goldwasser, "An efficient probabilistic public-key encryption scheme which hides all partial information", CRYPTO '84.
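To make the mechanics above concrete, here is a toy C sketch (my own illustration, not code from the paper) using the tiny p = 11, q = 19 from the article's example; real use needs large random primes and a big-integer library, so this only shows the arithmetic:

#include <stdio.h>
#include <stdint.h>

/* modular exponentiation; operands stay tiny here, so no overflow issues */
static uint64_t powmod(uint64_t b, uint64_t e, uint64_t m) {
    uint64_t r = 1;
    b %= m;
    while (e) {
        if (e & 1) r = (r * b) % m;
        b = (b * b) % m;
        e >>= 1;
    }
    return r;
}

int main(void) {
    const uint64_t p = 11, q = 19, N = p * q;   /* both primes are 3 mod 4 */
    const int L = 4;
    const int msg[4] = {1, 0, 1, 1};            /* 4-bit message */

    /* encryption: XOR message bits with the lsb of the BBS states */
    uint64_t x = (3 * 3) % N;                   /* x_1 from toy seed x_0 = 3 */
    int cipher[4];
    for (int i = 0; i < L; i++) {
        cipher[i] = msg[i] ^ (int)(x & 1);
        x = (x * x) % N;
    }
    uint64_t xfinal = x;                        /* x_(L+1), sent with the c_i */

    /* decryption: recover x_1 from x_(L+1) via the private factors and CRT */
    uint64_t dp = powmod((p + 1) / 4, L, p - 1);
    uint64_t dq = powmod((q + 1) / 4, L, q - 1);
    uint64_t up = powmod(xfinal % p, dp, p);    /* x_1 mod p */
    uint64_t uq = powmod(xfinal % q, dq, q);    /* x_1 mod q */
    uint64_t pinv = powmod(p % q, q - 2, q);    /* p^-1 mod q, by Fermat */
    uint64_t x1 = (up + p * ((((uq + q - up % q) % q) * pinv) % q)) % N;

    printf("recovered x_1 = %llu\n", (unsigned long long)x1);
    x = x1;
    for (int i = 0; i < L; i++) {
        printf("bit %d: sent %d, decrypted %d\n",
               i, cipher[i], cipher[i] ^ (int)(x & 1));
        x = (x * x) % N;
    }
    return 0;
}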
Integers are "suspected" hard to factor?
I was thinking about adding an external link to the P vs. NP Millennium Prize problem at the Clay Institute, but it's much more straightforward to hyperlink the phrase "integer factorization" to the Wikipedia page with that title, so I did that instead, even though there was already another link to that page in the preceding sentence.
Move page?
I don't like the name "Blum Blum Shub" for this page. I think this material should exist at "Blum-Blum-Shub pseudorandom generator" because (1) that title makes it clear what the article is about, and (2) the standard in crypto literature is to hyphenate author names when written out, e.g. Diffie-Hellman key exchange, Goldwasser-Micali cryptosystem, Boneh-Franklin IBE, et cetera. Actually, now that I think about it, I may try to start a naming convention for crypto articles, because a lot of them are named badly. Anyway, that's why I moved the page. Mangojuice 02:06, 21 January 2006 (UTC)
Possible Mistake
Is the Big O function supposed to be "log log M" or is that just a typo? Is it supposed to be the log of the log of M? Zero 13:15, 12 September 2006 (UTC)
- Gotta be. If it were just the log of M, we would be looking at all the bits of the modulus. 209.210.225.6 03:59, 3 February 2007 (UTC)
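A quick back-of-the-envelope check of what "log log M" means in practice (my own arithmetic, not from the article): for an n-bit modulus, log2(M) is roughly n, so log2(log2(M)) is roughly log2(n), i.e. only about 10-12 bits per iteration for common modulus sizes rather than thousands.

/* rough arithmetic only: log2(M) ~ n for an n-bit modulus,
   so "log log M" corresponds to about log2(n) bits per iteration */
#include <stdio.h>
#include <math.h>

int main(void) {
    const int sizes[] = {512, 1024, 2048, 3072};
    for (int i = 0; i < 4; i++)
        printf("%4d-bit modulus: log2(log2(M)) ~ %.1f bits per iteration\n",
               sizes[i], log2((double)sizes[i]));
    return 0;
}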
Note on the unhelpfulness of the security proof
The security of the BBS generator is proven by means of a reduction, showing that an algorithm which successfully distinguishes a BBS output from a random string (equivalently, predicts the next bit) with non-negligible success can be used to factor the modulus. However, this "reduction" is very inefficient. In fact, as discussed in section 6.1 of N. Koblitz and A. Menezes, "Another look at 'provable security', II", http://eprint.iacr.org/2006/229, the current security reductions are almost completely useless. For currently sensible key sizes (up to 3072 bits) they show security only against an adversary who uses at most 2^(-k) clock cycles, for some positive k. Yes, that was a minus sign: you've got to distinguish the bits from random bits in much less than one clock cycle!
Another warning. The current proof shows that O(log log M) bits are simultaneously hard-core, and can therefore be extracted at each iteration. There's a constant hidden in that O, and it seems to be less than one. Again, this is discussed in the Koblitz and Menezes paper.
Summary: beware of asymptotic results. At least as far as security is concerned, they're much less useful than they look.
Mistake in the Example?
The page says that p and q should be primes, but one of the values used in the example doesn't seem like a prime to me. Either there is a mistake in the example, or the description of the algorithm is somehow misleading. -- Samuli, 1 September 2006.
- There is yet another mistake in the example. For the parameters given there, the resulting bits would be b0 b1 ... b5 = 1 0 0 0 0 0. Michael Prager 13:29, 15 November 2006 (UTC)
- The writer probably meant that each output bit b_i is determined by taking the bit parity of x_i. I tried to clarify this. -- Jitse Niesen (talk) 02:24, 16 November 2006 (UTC)
Math is Thick
I want to understand the example, but I have no idea what phi means. How may I go about finding what it means in context? It is un-googlable. —Preceding unsigned comment added by 70.189.96.232 (talk) 01:57, 26 February 2008 (UTC)
- A link on the symbol already points to its definition, i.e. the article on Euler's totient function. But the link is indeed easy to overlook. 85.2.120.226 (talk) 06:22, 26 February 2008 (UTC)
- Neat, at the first mention; I see it now. I had actually landed on that page earlier via a list of symbols on the wiki, but dismissed it because it looks different on that page. 70.189.96.232 (talk) 23:45, 26 February 2008 (UTC)
Relic of Revisions?
At the end of the first paragraph in "Example":
"The following table shows different output bits of different bit is used to determine the output."
It might just be me, but as far as I can tell, that makes no sense. I for one am completely unable to figure out what it means. Was this a relic from someone's revisions of the section? Could someone who has an idea what it is meant to mean fix it up?
- I've just made an attempt to clarify this (and answer the question below). Wocky (talk) 01:39, 5 November 2009 (UTC)
But then who was S?
In the worked example p = 11, q = 19 and s = 3. x-1 is set to s, but what is s? If it's an arbitrary variable like p and q, then why not just say an x-1 is picked? Otherwise, how about a description of s? 86.132.9.254 (talk) —Preceding undated comment added 21:09, 9 August 2009 (UTC).
- I would imagine that s stands for seed, as that’s the role it takes here. It’s slightly nicer (aesthetically) to say the algorithm is defined by the tuple (p,q,s) than (p,q,x−1). —Stephen Morley (talk) 07:20, 23 September 2009 (UTC)
Error in formula for calculating x_i?
The formula for calculating x_i directly states that x_i = x_0^(2^i mod ((p-1)(q-1))) mod M.
However, in the original paper the formula is given as x_i = x_0^(2^i mod λ(M)) mod M, where λ is the Carmichael function.
Since M = pq, then λ(M) = lcm(p-1, q-1), which is not always equal to (p-1)(q-1) (for example, when p = 5 and q = 7, (p-1)(q-1) = 24 but lcm(p-1, q-1) = 12).
Is there a problem with my math or is the article incorrect?
Thanks. Rapidflash (talk) 04:38, 10 November 2010 (UTC)
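For what it's worth, here is a small C sketch (mine, not from the paper) that compares plain iteration against both closed forms on the tiny p = 5, q = 7 example above; it doesn't settle which statement the article should quote, but it makes it easy to experiment with other parameters:

#include <stdio.h>
#include <stdint.h>

/* modular exponentiation; values stay tiny, so 64-bit products are safe */
static uint64_t powmod(uint64_t b, uint64_t e, uint64_t m) {
    uint64_t r = 1;
    b %= m;
    while (e) {
        if (e & 1) r = (r * b) % m;
        b = (b * b) % m;
        e >>= 1;
    }
    return r;
}

int main(void) {
    const uint64_t p = 5, q = 7, M = p * q;
    const uint64_t phi = (p - 1) * (q - 1);   /* 24 */
    const uint64_t lam = 12;                  /* lcm(p-1, q-1) */
    const uint64_t x0 = 2;                    /* a seed coprime to M */
    uint64_t x = x0;
    for (int i = 1; i <= 10; i++) {
        x = (x * x) % M;                                  /* plain iteration */
        uint64_t a = powmod(x0, powmod(2, i, phi), M);    /* (p-1)(q-1) form */
        uint64_t b = powmod(x0, powmod(2, i, lam), M);    /* Carmichael form */
        printf("i=%2d  iterated=%2llu  phi-form=%2llu  lambda-form=%2llu\n",
               i, (unsigned long long)x, (unsigned long long)a,
               (unsigned long long)b);
    }
    return 0;
}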
Inconsistent/confusing use of φ in summary/example
The summary says that p and q should be primes, and that "gcd(φ(p), φ(q)) should be small". Assuming this is correct, it's unnecessarily complicated, because Euler's totient function states that for prime p, φ(p) = p-1. Therefore, that could be written more clearly as "gcd(p - 1, q - 1)".
The example, however, refers to "gcd(φ(p - 1), φ(q - 1))". I suspect that this was intended to be "gcd(p - 1, q - 1)", since that's consistent with the "gcd(φ(p), φ(q))" above. Stevie-O (talk) 18:57, 23 September 2019 (UTC)
No, there was a mistake; it's gcd(phi(p-1), phi(q-1)). --77.157.180.59 (talk) 09:40, 2 December 2019 (UTC)
Issues with cycle length
Hi all, coming from the French page to seek some extra information.
I have calculated cycle lengths for a variety of parameters.
It all looks like the ideal cycle length is linear in M, but for some reason I get a lot of drops in cycle length for some M.
I only had a look at very small parameters, M < 1,000,000, but I looked at all of them, for several seeds.
It also happens that I get a drop in cycle length for some seeds, but this is less common. I was wondering if I missed some criteria beyond those listed in the definition (two primes of the 4k+3 form, with a low gcd, and the seed divisible by neither p nor q; what else?).
So I can't explain to myself why the cycle length drops for some particular M's, although they seem to follow the same rules as the others, and (less significant but still important) why certain seeds behave badly. Isn't there some missing condition about, I don't know, the "smoothness" of M?
I could go on to calculating longer cycles to see if the problem continues, but it will get harder and harder to measure long cycles... so I won't get much further, I fear.
Another idea is about the gcd of p-1 and q-1. I stuck with gcd = 2 and 6, which is the lowest you can get. I can clearly see that when gcd = 2 it performs way better than gcd = 6, so maybe if I start working with larger numbers and keep my gcd as low as possible, I'll improve the linear relation between M and the cycle length? Like in the discussion above, where someone wondered if 2^4 was low enough? Mmmh, I'd say no; it's better to aim for 2, not more, but is it that important? Regards
---EDIT---
Here are a few examples I computed of RSA numbers (M) that seem to have very short cycles (less than 1000) on random seeds:
124315108793 126682326977 132752243441 137778877289 138983498233 151990264421 141889257737 129159006553 130193764633 146808817637 150758144309 159121735673 175866224393 152536526057 171211769357 171272086913 171340623737 181763831057 167152279601 172160575777
For example, 150758144309, which is 359291 * 419599, has a cycle length of 180 under the BBS algorithm, so there must be some reason.
For another example, 151848964097, which is 356819 * 425563, has a cycle length of 391860, which is still not good,
but 151580636209, which is 356819 * 424811, has a cycle length of 4264260, which is far from extraordinary but still better.
EDIT n°2
It appears there was a small mistake in one of the conditions: it is gcd(phi(p-1), phi(q-1)) that has to be small for the algorithm to be effective. This raises another question about how to calculate phi(p-1) efficiently.
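In case it helps anyone reproduce these measurements, here is a small C sketch (mine) that measures the cycle length of x -> x^2 mod M from a given seed with Floyd's cycle-finding algorithm, doing the multiplication in 128 bits so the overflow issue mentioned in the first section doesn't bite:

#include <stdio.h>
#include <stdint.h>

/* unsigned __int128 is a GCC/Clang extension */
static uint64_t next_state(uint64_t x, uint64_t M) {
    return (uint64_t)(((unsigned __int128)x * x) % M);
}

/* length of the cycle eventually entered when iterating from seed;
   may take a while if the seed happens to land on a long cycle */
static uint64_t cycle_length(uint64_t seed, uint64_t M) {
    uint64_t slow = next_state(seed, M);
    uint64_t fast = next_state(next_state(seed, M), M);
    while (slow != fast) {                       /* find a point on the cycle */
        slow = next_state(slow, M);
        fast = next_state(next_state(fast, M), M);
    }
    uint64_t len = 1;
    for (uint64_t x = next_state(slow, M); x != slow; x = next_state(x, M))
        len++;
    return len;
}

int main(void) {
    /* one of the moduli reported above as having very short cycles */
    const uint64_t M = 150758144309ULL;          /* 359291 * 419599 */
    for (uint64_t seed = 2; seed < 10; seed++)
        printf("seed %llu: cycle length %llu\n",
               (unsigned long long)seed,
               (unsigned long long)cycle_length(seed, M));
    return 0;
}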