Message boards : General discussion : Eratosthenes sieving?

Message 78410 - Posted: 28 Jul 2014 | 15:58:33 UTC

I've been crunching at PrimeGrid for a little over four years now, and though I'm no mathematician at all, I've always tried to work through the forums to understand the math behind the project.

In a recent thread, Michael made very clear that only numbers of a certain form can be (easily) tested for primality (by LLR, I guess). It's not feasible to test a random large number. As far as I understand, this results in ever larger primes, but also ever larger gaps between discovered primes, where countless primes may be (are effectively?) hiding.

To find all those (very much) smaller primes, would the Sieve of Eratosthenes be the only way? Could existing sieving software be 'easily' used to find 'every single prime', starting at the edge of the known list?

A complete list of all primes exists up to a certain number. Is someone out there extending this list by sieving, using all the lower primes?
If not, would it be a feasible project, in order to get a complete list rather than just fairly random, ever larger primes?
And if it is feasible, is it worthwhile?
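For the small ranges where a complete list is even conceivable, the sieve the thread title asks about is only a few lines of code; a minimal sketch in Python:

```python
def eratosthenes(limit):
    """Return every prime <= limit by repeatedly crossing off multiples."""
    is_prime = [True] * (limit + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            # Cross off every multiple of p, starting at p*p
            for multiple in range(p * p, limit + 1, p):
                is_prime[multiple] = False
    return [n for n, flag in enumerate(is_prime) if flag]

print(eratosthenes(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```

Note that this needs memory proportional to the limit itself, which is one reason it stops being an option long before the sizes discussed in the replies.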

Message 78414 - Posted: 28 Jul 2014 | 17:20:14 UTC
Last modified: 28 Jul 2014 | 17:29:41 UTC

The biggest problem is that we (at least I) can't readily conceive what numbers such as "123456^2^20+1" actually mean.

If I asked you (either you personally or you collectively) to come up with a list of ALL primes under 1 million, you could probably figure out a way to do it.

We can grasp how big a million is. There's roughly 10 million people living within 25 miles of me. That's 8 digits.

There's about 7 billion people on the planet. I can grasp that. This is a 10 digit number.

But 123456^2^20+1 is 5,338,838 digits long. I can't even begin to imagine what such a number represents. Any measurement of the physical universe -- say, the total number of quarks in the entire universe -- probably needs only 100 digits. So 5 million digits is truly incomprehensible. It is perhaps literally as well as figuratively "unimaginable".
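The digit count itself is a tiny calculation: b^e has floor(e * log10(b)) + 1 decimal digits. A quick check in Python:

```python
import math

# Number of decimal digits of 123456^(2^20) + 1.  (Adding 1 cannot change
# the digit count, since 123456^(2^20) is not a power of 10.)
digits = int(2 ** 20 * math.log10(123456)) + 1
print(digits)  # 5338838 -- a bit over 5.3 million digits
```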

So when you talk about "all the prime numbers", you're not fully grasping the problem. That's not surprising, because you can't.

To put things into perspective, if you were to somehow generate a list of all the prime numbers up to one of "our" prime numbers, you couldn't store it anywhere because there probably aren't enough disk drives on Earth to store all those numbers.

The numbers we deal with are so incredibly large that even the thought of checking all numbers below that is utterly infeasible. It's not just problems such as, "my computer isn't fast enough". You run into problems like, "the universe isn't large enough."

If you're thinking of finding a comprehensive list of primes, you're probably limited to looking at primes of less than 10 digits. Perhaps 15 digits might be possible with a ridiculous amount of effort. 20 digits is almost certainly impossible.

The smallest of the numbers we're searching (on BOINC) is over 300,000 digits. 300,000 digit numbers are not 15,000 times larger than 20 digit numbers; they're 10^299980 times larger. To put that in perspective, consider the period at the end of this sentence. If you make that period 10^299980 times larger, it would be larger than the entire visible universe.
____________
My lucky number is 75898^524288+1

Message 78418 - Posted: 28 Jul 2014 | 18:21:45 UTC

A megaprime is a prime number with at least one million decimal digits, i.e. exceeding 10^999999.

Even though the average distance between neighboring primes near 10^999999 is quite large, namely approximately 2.3026 × 10^6 (2.3 million), there are still about 4.3430 × 10^999992 distinct primes that are too small to be megaprimes. If each one required only one atom to "write" on a list, we would still need more than 10^999892 universes to hold the list. (I divided by 10^100, a wild over-estimate of the number of atoms in one universe, but the quotient seems just as incomprehensible as the dividend!)
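Both figures fall straight out of the Prime Number Theorem (average gap near x is about ln x, and π(x) ≈ x / ln x) and can be checked by working with exponents rather than the unstorable numbers themselves; a quick sketch:

```python
import math

# Average gap between primes near x = 10^999999 is ln(x) = 999999 * ln(10)
gap = 999999 * math.log(10)
print(f"{gap:.4e}")  # ~2.3026e+06, i.e. about 2.3 million

# pi(x) ~ x / ln(x).  The count itself has ~10^6 digits, so work with its
# base-10 exponent instead: log10(x / gap) = 999999 - log10(gap)
exponent = 999999 - math.log10(gap)
mantissa = 10 ** (exponent - int(exponent))
print(f"{mantissa:.4f} * 10^{int(exponent)}")  # ~4.3430 * 10^999992
```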

For the same reason a classical sieve to find all these numbers is absolutely impossible.

Testing huge numbers like that is only possible for numbers p of a very special form, that is, numbers where we know from the outset that p-1 (or p+1) is composed of only small prime factors (like a couple of million "copies" of the prime factor 2, plus a few other small primes). For "random" numbers of this size, the known methods for testing primality take much, much, much too long to be practical.
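What the "very special form" buys can be seen in miniature with Proth's theorem: for N = k*2^n + 1 with odd k < 2^n, a single modular exponentiation per base settles primality, precisely because N-1 factors completely. A sketch (the function name is mine, not any actual PrimeGrid code):

```python
def proth_is_prime(k, n, tries=30):
    """Proth's theorem: N = k*2^n + 1 with odd k < 2^n is prime
    iff a^((N-1)/2) == -1 (mod N) for some base a.  One modular
    exponentiation per base -- feasible only because N - 1 = k*2^n
    has completely known factors."""
    assert k % 2 == 1 and k < 2 ** n
    N = k * 2 ** n + 1
    for a in range(2, 2 + tries):
        r = pow(a, (N - 1) // 2, N)
        if r == N - 1:
            return True   # witness found: N is certainly prime
        if r != 1:
            return False  # Euler's criterion violated: N is composite
    return False          # inconclusive; astronomically unlikely for prime N

print(proth_is_prime(3, 8))   # 3*2^8  + 1 = 769 is prime  -> True
print(proth_is_prime(3, 10))  # 3*2^10 + 1 = 3073 = 7*439  -> False
```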

/JeppeSN

Message 78426 - Posted: 29 Jul 2014 | 8:37:46 UTC - in response to Message 78414.

So when you talk about "all the prime numbers", you're not fully grasping the problem. That's not surprising, because you can't.

I guess you got that right. While I technically knew the size of the numbers involved, I hadn't emotionally grasped the sheer scale of the problem. I probably still don't. But your patient explanation, though sobering, made some of it clear.

I think I saw somewhere a comprehensive list of primes up to 25 or 50 million. That's 8 digits, and not finished. Extending this list to 9 or 10 digits was what I initially meant, but I understand now that this effort would be about as significant as a single step en route to the other side of the universe. Even less so.

This brings me to a technical question. When sieving to remove numbers with factors, which (range of) primes are you (we) using to sieve with?

Message 78429 - Posted: 29 Jul 2014 | 11:43:19 UTC - in response to Message 78426.
Last modified: 29 Jul 2014 | 11:43:52 UTC

This brings me to a technical question. When sieving to remove numbers with factors, which (range of) primes are you (we) using to sieve with?

Good question!

Here's the (almost) full answer:

PSA manual sieving:

PPR: k * 2 ^ n +/- 1, 2<k<10000, 9M<n<12M (roughly 2.7 million digits to 3.6 million digits)

GFN n=18: b ^ 262144+1, 2<=b<=100M (up to 2.1 million digits)
GFN n=19: b ^ 524288+1, 2<=b<=100M (up to 4.2 million digits)
GFN n=20: b ^ 1048576+1, 2<=b<=100M (up to 8.4 million digits)
GFN n=21: b ^ 2097152+1, 2<=b<=100M (up to 16.8 million digits)
GFN n=22: b ^ 4194304+1, 2<=b<=100M (up to 33.5 million digits)

GCW (GC 13) : n * 13 ^ n + 1, 500K < n <= 1M (between 0.5 million digits and 1.1 million digits)
GCW (GC 25): n * 25 ^ n + 1, 500K < n <= 1M (between 0.7 million digits and 1.4 million digits)

Factorial and Primorial: I'm not really sure. I'd have to dig around to answer this, but they're likely much smaller numbers because the numbers we test are much smaller.

BOINC:

PPS Sieve: k * 2 ^ n +/- 1, 5<=k<10000, 6M<n<9M (roughly 1.8 million digits to 2.7 million digits)

PSP/SoB/ESP Sieve: k * 2 ^ n + 1, 10M<n<50M (3 million digits to 15 million digits)
(Currently searching the 6 remaining k's in the ESP problem)

TRP Sieve: Not sure, but probably similar to the PSP/SoB/ESP sieve.

In theory, if we were to sieve to "full" depth, i.e. the square root of the numbers being searched, which is roughly half the number of digits, the sieve itself would tell you every single prime. Of course, you still have the "the universe isn't large enough and won't last long enough" problem. For example, GFN n=22 is currently sieved to 104400P. That's 1.044e+20, a 21 digit number. To fully sieve it to the point where everything left is guaranteed to be prime, we'd have to increase the sieve depth from a 21 digit number to a 16 million digit number. Again, that's one of those incomprehensibly large numbers you just can't really wrap your head around. (And the sieving software would break long before then. The sun would probably die before the software breaks, however, so we don't really have to worry about the software!)
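The reason sieving a fixed-form candidate list is so much cheaper than trial division can be shown in a toy version: each small prime p eliminates a whole arithmetic progression of exponents n at once, because the powers of 2 mod p cycle. A sketch of the idea, assuming nothing about any real sieve program (function names are my own):

```python
def small_primes(limit):
    """Ordinary Sieve of Eratosthenes for the sieving primes themselves."""
    is_p = [True] * (limit + 1)
    is_p[0] = is_p[1] = False
    for p in range(2, int(limit ** 0.5) + 1):
        if is_p[p]:
            for m in range(p * p, limit + 1, p):
                is_p[m] = False
    return [p for p, flag in enumerate(is_p) if flag]

def sieve_fixed_k(k, n_min, n_max, depth):
    """Cross off every n for which some prime p <= depth divides k*2^n + 1.
    The trick: p | k*2^n + 1 iff 2^n == -1/k (mod p), and since the powers
    of 2 mod p cycle with period ord_p(2), one solution n0 eliminates the
    whole progression n0, n0 + ord, n0 + 2*ord, ... in one pass."""
    alive = set(range(n_min, n_max + 1))
    for p in small_primes(depth):
        if p == 2 or k % p == 0:
            continue  # k*2^n + 1 is odd, and p | k means p never divides it
        target = (-pow(k, -1, p)) % p  # we need 2^n == target (mod p)
        power, order, hit = 1, None, None
        for e in range(1, p):          # ord_p(2) divides p - 1
            power = power * 2 % p
            if hit is None and power == target:
                hit = e
            if power == 1:
                order = e
                break
        if hit is None:
            continue  # target is not a power of 2 mod p: p divides no candidate
        n = hit
        while n < n_min:
            n += order
        while n <= n_max:
            alive.discard(n)
            n += order
    return sorted(alive)
```

Sieving to depth sqrt(N) would leave only primes standing, which is exactly the "full depth" described above, and exactly what the size of the numbers rules out.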

For me, I tend to think of anything between 20 and 100 digits as being "as big as the universe." Anything larger than 100 digits I just consider to be "infinitely large" for all practical purposes.

Actually, if you're talking about global data storage capacity, perhaps numbers with as many as 30 digits are meaningful. So 20 to 30 digits might fall in a "as big as the Internet" range.
____________
My lucky number is 75898^524288+1

Message 78433 - Posted: 30 Jul 2014 | 0:48:15 UTC

This is a sobering, and fun, reminder of both the magnitude of the science and the numbers themselves. To think that my tiny little laptop (not to mention my i7) could address problems of this sort is astonishing. Thanks to the s/w authors, h/w manufacturers, and everyone in between. And the community!

The number of elementary particles in the universe likely doesn't exceed 10^85 (cite: quick web search). It certainly doesn't approach 10^100 (informally, a "googol"; note spelling). And yet the numbers we test here are vastly larger than this. Now if you want to read about a really big named number, check out Graham's Number. Remember, that number is tiny compared to the vast majority still out there... :-)

--Gary

Message 78436 - Posted: 30 Jul 2014 | 4:29:19 UTC - in response to Message 78426.

This brings me to a technical question. When sieving to remove numbers with factors, which (range of) primes are you (we) using to sieve with?

One of the other useful properties of these numbers is that their factors can only take predetermined forms. Mersenne numbers (2^p - 1), for example, have factors only of the form 2*k*p+1, with 1<k<2^[(p+1)/2]. Further restrictions can be placed on k, and these (possibly composite) k values are used to search for prime factors of the above form. Generally we don't care about the actual prime factor but about the k which generates it (a computer science / number theory perspective, I believe).

As for other forms, other users could give better answers. The same goes for the restrictions on k.
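That restricted factor form is easy to demonstrate: since every prime factor q of 2^p - 1 (p prime) has the shape q = 2kp + 1, trial factoring only has to walk that thin sequence of candidates. A sketch (the function name is mine):

```python
def mersenne_trial_factor(p, k_limit=10**6):
    """Search for a factor of 2^p - 1 among candidates q = 2*k*p + 1,
    the only shape a prime factor of a Mersenne number can take.
    q divides 2^p - 1 exactly when 2^p == 1 (mod q)."""
    for k in range(1, k_limit + 1):
        q = 2 * k * p + 1
        if pow(2, p, q) == 1:
            return q
    return None  # no factor found below 2*k_limit*p + 1

print(mersenne_trial_factor(11))  # 23 = 2*1*11 + 1 divides 2^11 - 1 = 2047
print(mersenne_trial_factor(23))  # 47 = 2*1*23 + 1 divides 2^23 - 1
```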

Here is a PHP script for factoring anything below 46 digits, which can yield prime factors of up to 23 digits. I don't know how complete this list is.

http://www.mersenne.ca/factor.php

Message 78437 - Posted: 30 Jul 2014 | 6:54:03 UTC - in response to Message 78429.

This brings me to a technical question. When sieving to remove numbers with factors, which (range of) primes are you (we) using to sieve with?

I don't think you've read his question right. I believe he was asking about the sieve depth of each of these sieves.

Message 78440 - Posted: 30 Jul 2014 | 11:11:43 UTC - in response to Message 78437.

I don't think you have gotten his question right. I believe he was asking about the sieve depth of each of these sieves.

That's an easier question, and one actually answerable by users themselves if they do a little digging. The deepest sieve is probably GFN n=22, which is 21 digits.
____________
My lucky number is 75898^524288+1


Copyright © 2005 - 2021 Rytis Slatkevičius and PrimeGrid community.