Message boards : News : PPS MEGA Prime of the Month!
Scott Brown (Volunteer moderator, Project administrator, Volunteer tester, Project scientist)
Joined: 17 Oct 05 Posts: 1797 ID: 1178 Credit: 5,649,142,009 RAC: 3,037,522

On 1 February 2015, 16:49:00 UTC, PrimeGrid's Mega Prime Search found the Mega Prime:
159*2^3425766+1
The prime is 1,031,261 digits long and enters Chris Caldwell's The Largest Known Primes Database ranked 107th overall.
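A quick way to sanity-check the digit count without constructing the million-digit number itself is to take logarithms (a standalone sketch, not part of the announcement):

```python
import math

# Digit count of k*2^n + 1 is floor(log10(k) + n*log10(2)) + 1;
# the trailing "+1" of the prime cannot change the digit count here.
k, n = 159, 3425766
digits = math.floor(math.log10(k) + n * math.log10(2)) + 1
print(digits)  # 1031261
```

The fractional part of the logarithm is well clear of an integer boundary here, so double-precision floats are sufficient for this check.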
The discovery was made by Evelyn Chew (Crackenback) of Australia using an Intel(R) Core(TM) i5-4670 CPU @ 3.40GHz with 8GB RAM running Microsoft Windows 7 Enterprise. This computer took about 1 hour 31 minutes to complete the primality test using LLR. Evelyn is a member of the BOINC@AUSTRALIA team.
The prime was verified on 1 February 2015, 19:11:39 UTC, by Matt Jurach (mattozan) of the United States using an Intel(R) Core(TM) i7-4500U CPU @ 1.80GHz with 16GB RAM running Microsoft Windows 8.1. This computer took about 6 hours and 9 minutes to complete the primality test using LLR. Matt is a member of the Aggie The Pew team.
For more details, please see the official announcement.
 

Scott Brown (Volunteer moderator, Project administrator, Volunteer tester, Project scientist)

This was PrimeGrid's 4th mega prime discovery of 2015 and our 68th mega prime discovery overall. It also extends our streak of finding at least one mega prime every month which began back in 2013!
 


Christopher Siegert
At what point will it be recognized that a million digits is just an arbitrary size?
Eventually, a "mega prime" will be found every week, then every day...
There's nothing intellectually unique or special about a "mega prime".
Why make special announcements for these numbers?
Eventually, once computers get faster, I will personally be able to discover a new mega prime every day.  

Scott Brown (Volunteer moderator, Project administrator, Volunteer tester, Project scientist)

Christopher Siegert wrote: At what point will it be recognized that a million digits is just an arbitrary size?
Eventually, a "mega prime" will be found every week, then every day...
There's nothing intellectually unique or special about a "mega prime".
Why make special announcements for these numbers?
Eventually, once computers get faster, I will personally be able to discover a new mega prime every day.
Actually, I don't think that this will be true at all. We just happen to be at a juncture of reasonably fast computers and a particularly rich zone for finding these primes for certain projects. That is, the PPS Mega and SR5 projects are just over the 1 million digit threshold, where such primes are the most common. Finds are considerably slower on the ESP and TRP projects, which are now over the 2 million digit threshold. On other searches, the rarity is even greater (e.g., when was the last time you saw a Cullen or Woodall announcement?). As we continue to search larger and larger numbers, I think that the increasing rarity will largely keep pace with (and not infrequently exceed) the increases in computing speed.
As for why 1 million digits is somewhat special, an obvious answer is that there are only modestly more than 100 known. In ten years of searching, PG has found 68 of these; that is an incredible amount of resources over a decade to find only 68. For the immediate future, I think these unusually large finds are worth a couple-hundred-word announcement in our news threads.
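As a back-of-the-envelope illustration of that rarity (my sketch, not from the post): by the prime number theorem, a randomly chosen number near 10^1,000,000 is prime with probability about 1/ln(10^1,000,000):

```python
import math

# Prime number theorem heuristic: a number near N is prime with
# probability ~ 1/ln(N).  A million-digit candidate has N ~ 10^1_000_000.
ln_N = 1_000_000 * math.log(10)
print(f"roughly 1 in {round(ln_N):,}")  # roughly 1 in 2,302,585
```

In practice the odds per LLR test are much better than this, since candidates of the form k*2^n+1 are heavily sieved first, but the heuristic shows why million-digit primes are so thin on the ground.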
 

composite (Volunteer tester)
Joined: 16 Feb 10 Posts: 630 ID: 55391 Credit: 471,749,400 RAC: 150,133

Christopher Siegert wrote: At what point will it be recognized that a million digits is just an arbitrary size?
Eventually, a "mega prime" will be found every week, then every day...
There's nothing intellectually unique or special about a "mega prime".
Why make special announcements for these numbers?
Eventually, once computers get faster, I will personally be able to discover a new mega prime every day.
I concur with Scott. Prime numbers are naturally scarcer at larger sizes. The only ways to find large primes at a faster cadence are:
- throwing more computing resources at them (and so exhausting the available primes sooner)
- algorithmic and programming advances that speed up computations (all help here is appreciated)
- inventing new algorithms that expose a different "class" of primes to testing in reasonable times. Certain mathematicians' names have become household words at PrimeGrid this way.
 


Non-mega primes are much more rare. Asymptotically, only 0% of the primes are non-mega. In fact, more can be said, because it has been mathematically proven that only a finite number of non-mega primes exist. As an example of these extraordinarily rare primes I will mention 101 (a palindrome and a generalized Fermat number). /JeppeSN

composite (Volunteer tester)

JeppeSN wrote: Non-mega primes are much more rare. Asymptotically, only 0% of the primes are non-mega. In fact, more can be said, because it has been mathematically proven that only a finite number of non-mega primes exist. As an example of these extraordinarily rare primes I will mention 101 (a palindrome and a generalized Fermat number). /JeppeSN
Right you are. By "scarcer" I was really discussing density of primes.  


Christopher Siegert
Scott Brown wrote: In ten years of searching, PG has found 68 of these, but that is an incredible amount of resources over a decade that found only 68.
Make that 69 with plenty more on the way.
It's called "exponential growth". It'll start off seeming quite slow, and then suddenly 1,000,000 digits won't even seem significant.
I'm sure there was a time when 350,000 digit primes seemed quite large, but now all of the top 5000 primes are at least that size. In fact, the first prime that I ever submitted to the top 5000 was initially ranked #792 with only 78,125 digits. That was in 2004.
So far, 59 days into 2015, five megaprimes have been found at PrimeGrid. At that rate, we can expect a total of 31 megaprimes for 2015.
I'd bet $100 that at least 50 megaprimes will be discovered in 2016 (not necessarily just at PrimeGrid though).
Scott Brown wrote: We just happen to be at a juncture of reasonably fast computers and a particularly rich zone for finding these primes for certain projects.
Given P = k*2^n +/- 1, there are millions of testable numbers for each sufficiently large integer n. So there's simply no shortage of numerical territory to cover.
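For what it's worth, the "31 megaprimes" figure above is just a linear extrapolation of the year-to-date rate; a minimal check:

```python
# Linear extrapolation: 5 finds in the first 59 days of 2015,
# scaled out to the full 365-day year.
finds, days = 5, 59
print(round(finds / days * 365))  # 31
```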

composite (Volunteer tester)

Christopher Siegert wrote:
It's called "exponential growth". It'll start off seeming quite slow, and then suddenly 1,000,000 digits won't even seem significant.
I'm sure there was a time when 350,000 digit primes seemed quite large, but now all of the top 5000 primes are at least that size. In fact, the first prime that I ever submitted to the top 5000 was initially ranked #792 with only 78,125 digits. That was in 2004.
So far, 59 days into 2015, five megaprimes have been found at PrimeGrid. At that rate, we can expect a total of 31 megaprimes for 2015.
I'd bet $100 that at least 50 megaprimes will be discovered in 2016 (not necessarily just at PrimeGrid though).
Scott Brown wrote: We just happen to be at a juncture of reasonably fast computers and a particularly rich zone for finding these primes for certain projects.
Given P = k*2^n +/- 1, there are millions of testable numbers for each sufficiently large integer n. So there's simply no shortage of numerical territory to cover.
I think we're talking about the rate of prime finding, so now we have a debate. In the bigger picture the rate resembles an S-curve (with time on the X axis and number of primes found on the Y axis), and when you look locally at the bottom of the curve it resembles an exponential curve. This is where we are now. However, the rate will peak and ultimately decline in the long run.
There are a few things to consider for the rate of prime-finding: the quantity of work needed to prove a given number prime (a polynomial function of the size of the number; see the "PRIMES is in P" proof by AKS, 2002); the density of primes (how many primes there are at a given size); the amount of work needed to find all primes at a given number of digits (which grows exponentially with the number of digits); and the available computing resources (finite, a resource-limited physical quantity). Recently, available computing resources have been growing at an exponential rate, but that won't last.
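Scott's earlier point that rarity keeps pace with speed can be put in rough numbers. In this crude model of my own (ignoring sieving and logarithmic factors), the expected number of candidates per prime grows linearly with the digit count d by the prime number theorem, and each LLR test costs roughly d^2, so the work per prime scales like d^3:

```python
import math

def relative_work(d):
    """Crude model of total work to find one prime of d digits."""
    tests_per_prime = d * math.log(10)  # ~ln(10^d) candidates per prime
    cost_per_test = d ** 2              # one LLR test, ignoring log factors
    return tests_per_prime * cost_per_test

# Doubling the digit count multiplies the work per prime by ~8.
print(relative_work(2_000_000) / relative_work(1_000_000))  # 8.0
```

Under this model, moving from the 1-million-digit projects (PPS Mega, SR5) to the 2-million-digit projects (ESP, TRP) costs about eight times as much work per find, consistent with the slower pace reported above.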
