
Message boards : Generalized Fermat Prime Search : EFF offering $150,000 prize for 1st 100M digit prime
GDB
Joined: 15 Nov 11 Posts: 252 ID: 119185 Credit: 2,742,844,710 RAC: 2,116,737

Since EFF is offering a $150,000 prize for the 1st 100M digit prime, any chance of GFN22 starting a 100M digit search soon?

Michael Goetz (Volunteer moderator, Project administrator)
Joined: 21 Jan 10 Posts: 13620 ID: 53948 Credit: 266,787,098 RAC: 299,875

Since EFF is offering $150,000 prize for the 1st 100M digit prime, any chance of GFN22 starting a 100M digit search soon?
You might want to do some math and figure out how large b would need to be such that b^(2^22)+1 has more than 100 million digits. Keep in mind that Genefer won't be able to search beyond b=400,000,000. That's only 36 million digits.
____________
My lucky number is 75898^{524288}+1  
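Michael's point is easy to check numerically. A minimal Python sketch (not PrimeGrid code) of the standard digit-count formula, digits(b^(2^n)+1) = floor(2^n * log10(b)) + 1:

```python
from math import floor, log10

def gfn_digits(b: int, n: int) -> int:
    """Decimal digits of b^(2^n) + 1; the '+1' never changes the digit count."""
    return floor(2**n * log10(b)) + 1

# At Genefer's hard limit b = 400,000,000 with n = 22:
print(gfn_digits(400_000_000, 22))  # roughly 36 million digits, as stated
```

At the maximum searchable b, a GFN22 number falls far short of the 100-million-digit threshold.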

Rafael (Volunteer tester)
Joined: 22 Oct 14 Posts: 899 ID: 370496 Credit: 371,131,833 RAC: 113,700

Since EFF is offering $150,000 prize for the 1st 100M digit prime, any chance of GFN22 starting a 100M digit search soon?
You might want to do some math and figure out how large b would need to be such that b^(2^22)+1 has more than 100 million digits. Keep in mind that Genefer won't be able to search beyond b=400,000,000. That's only 36 million digits.
Also, n is limited to 23, if I recall correctly. I remember from OCL beta testing that 23 was the max the program could handle (for whatever reason).


You might want to do some math and
If
b^(2^n) > 10^99,999,999
then
b > 10^(99,999,999 / 2^n).
We can create a table by plugging in:

n    required b
22   694,796,579,498,971,371,357,022.9
23   833,544,587,589.0
24   912,986.6
25   955.5
26   30.9
/JeppeSN
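JeppeSN's table can be reproduced in a few lines of Python. This is an illustrative sketch (the function name `min_b` is my own); note that floating-point arithmetic makes the enormous n=22 entry an order-of-magnitude estimate only:

```python
def min_b(n: int) -> float:
    """Smallest b (float estimate) such that b^(2^n)+1 exceeds 10^99,999,999."""
    return 10 ** (99_999_999 / 2**n)

for n in range(22, 27):
    print(f"n={n}: b > {min_b(n):,.1f}")
```

The n=24 row (b ≈ 913,000) is the one axn picks up on later in the thread.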
 

GDB
Joined: 15 Nov 11 Posts: 252 ID: 119185 Credit: 2,742,844,710 RAC: 2,116,737

Do we know what the max b is that the GFN software can handle?
If the max n is something like 25, 26, or 27, is it conceivable that we might start a 100M digit GFN search within a year or two, and wouldn't we need to begin a manual sieve soon?

Honza (Volunteer moderator, Volunteer tester, Project scientist)
Joined: 15 Aug 05 Posts: 1902 ID: 352 Credit: 3,597,086,456 RAC: 5,375,587

Do we know what the max b is that the GFN software can handle?
If the max n is something like 25, 26, or 27, is it conceivable that we might start a 100M digit GFN search within a year or two, and wouldn't we need to begin a manual sieve soon?
In my opinion, this is far from reasonable. Let me elaborate...
There would be a memory limit as well.
I can't remember the exact numbers, but today's GPUs are not up to the job.
Neither the memory, nor the performance... nor the patience.
If n=21 now takes around a single day per test, and n=22 almost 4 days (at the pretty high b left over from the days when n=22 was the world-record range, so there was extra incentive), then:
n=23 would be 16 days,
n=24 ~64 days,
n=25 ~256 days,
n=26 ~1000 days,
and n=27 around 10 years for a single test (!).
Almost all GPUs would die within 10 years of 24/7 running.
This is far from reasonable or sane.
Since we haven't found a single prime for n=20 yet (it would be Top 12; n=21 would be Top 6 and n=22 the second-largest known prime), there might be a problem finding any prime for n=24, let alone n=27 during the next decade.
Let's try to find a prime for n=20, 21, or 22 in the meantime.
On a side note, being a bit more realistic, I'm pushing for GFNMega. Based on historical finds, there should be about one prime per b range of ~250,000.
____________
My stats
Badge score: 1*1 + 5*1 + 8*3 + 9*11 + 10*1 + 11*1 + 12*3 = 186  
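Honza's estimates follow a simple scaling model: each step up in n doubles both the transform length and the number of squarings, so per-test time grows roughly 4x per increment. A back-of-envelope sketch of that model, with the one-day n=21 baseline taken from the post (an assumption about then-current GPUs, not a benchmark):

```python
base_days = 1.0  # assumed: one n=21 test per day on a then-current GPU

def est_days(n: int) -> float:
    """Rough per-test time: ~4x per increment of n above 21."""
    return base_days * 4 ** (n - 21)

for n in range(21, 28):
    print(f"n={n}: ~{est_days(n):,.0f} days per test")
```

This reproduces the figures above: ~16 days at n=23, ~1000 days at n=26, and over a decade per test at n=27.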

GDB
Joined: 15 Nov 11 Posts: 252 ID: 119185 Credit: 2,742,844,710 RAC: 2,116,737

Unless it were possible to split a 1000-day GFN run into 1000 one-day GFN runs and merge the results?

Neo (Volunteer tester)
Joined: 28 Oct 10 Posts: 710 ID: 71509 Credit: 91,178,992 RAC: 0

Unless it were possible to split a 1000-day GFN run into 1000 one-day GFN runs and merge the results?
GDB,
I like your determination.
Neo  

Michael Goetz (Volunteer moderator, Project administrator)
Joined: 21 Jan 10 Posts: 13620 ID: 53948 Credit: 266,787,098 RAC: 299,875

Unless it were possible to split a 1000-day GFN run into 1000 one-day GFN runs and merge the results?
Yes, but not in the way you meant, and not in a useful way.
It's technically possible to do what you're asking, but the pieces can't be run in parallel and then merged. The 1000 tasks would need to be run sequentially, with each task's output uploaded to the server and then downloaded by the next task. Essentially, it's the same as when you suspend a task: the state is saved in the checkpoint file, and then you restart from the checkpoint. Except now we're restarting the computation on another user's computer.
Bear in mind that the checkpoint file for a theoretical n=26 would be about 640 MB.
Even if we wanted to do this (it's utterly ridiculous and makes the process slower, not faster, since you need to wait for wingmen at each step), the server doesn't have that kind of disk space and we can't afford the bandwidth.
____________
My lucky number is 75898^{524288}+1  
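The ~640 MB figure is plausible from first principles. As a rough sketch (this is an assumption about the layout, not Genefer's actual checkpoint format), a residue mod b^(2^26)+1 stored as 2^26 eight-byte limbs already occupies 512 MiB before any transform state or metadata:

```python
# Assumption: the checkpoint must hold the full residue of the ongoing
# squaring, i.e. 2^n limbs; limb width of 8 bytes is illustrative.
n = 26
limb_bytes = 8
residue_mib = 2**n * limb_bytes / 2**20
print(f"~{residue_mib:.0f} MiB for the raw residue alone")
```

That is the same ballpark as the ~640 MB quoted, which is why shipping such a file between wingmen at every step is a bandwidth non-starter.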

Michael Goetz (Volunteer moderator, Project administrator)
Joined: 21 Jan 10 Posts: 13620 ID: 53948 Credit: 266,787,098 RAC: 299,875

GDB,
When GIMPS surpassed our GFN22 search, meaning any prime we found in GFN22 would fail to be the new largest known prime number, we investigated opening up GFN23 so we could continue to search for the world's largest prime.
It was decided that was impractical at this time.
If searching n=23 for a prime in the high 20 million digit range was deemed impractical, then searching even higher n values for the 100 million digit prime is exponentially more impractical.
Unless something very significant changes with regard to either hardware or software, I don't expect us to be searching for a 100M digit prime any time soon. "Soon", in this context, means the next decade or two. Bear in mind that we've been searching n=22 for over 5 years and we're only now beginning to talk about whether n=23 is reasonable. Realistically speaking, a 100M digit search would be run at n=25. We're a long way from running an n=25 search.
____________
My lucky number is 75898^{524288}+1  


The most likely candidate for the award mentioned is some Mersenne prime found by the Great Internet Mersenne Prime Search (GIMPS), just like the two previous prizes. It will take a long time before they reach the relevant sizes, but it will most probably take even longer before other projects can. GIMPS has special rules for how they share the money when (if) they can claim the next EFF award. /JeppeSN

GDB
Joined: 15 Nov 11 Posts: 252 ID: 119185 Credit: 2,742,844,710 RAC: 2,116,737

I'd probably put my money on the software being modified to either speed up or allow splitting over the next 10 years. Is there a link to the GFN software source somewhere?

Honza (Volunteer moderator, Volunteer tester, Project scientist)
Joined: 15 Aug 05 Posts: 1902 ID: 352 Credit: 3,597,086,456 RAC: 5,375,587

I'd probably put my money on the software being modified to either speedup or allow splitting over the next 10 years. Is there a link to the GFN software source somewhere?
In Source code repositories, try the Genefer link.
____________
My stats
Badge score: 1*1 + 5*1 + 8*3 + 9*11 + 10*1 + 11*1 + 12*3 = 186  


Hi everyone.
Will quantum computers ever help us find larger and larger prime numbers? Or are they only useful for factoring numbers?
Thanks  

axn (Volunteer developer)
Joined: 29 Dec 07 Posts: 285 ID: 16874 Credit: 28,027,106 RAC: 0

Realistically speaking, a 100M digit search would be run at n=25.
N=25 is _too big_ for a 100M search. N=24 would be optimal. You have the huge range 912986 < b <= 400M, which runs from 100M digits up to about 144M digits. N=25 would double the size and therefore make it 8 times harder to find a prime (except for a small patch at the beginning).
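axn's range can be checked with the digit-count formula digits = floor(2^n * log10(b)) + 1. A small illustrative Python check (not PrimeGrid code):

```python
from math import floor, log10

def gfn_digits(b: int, n: int) -> int:
    """Decimal digits of b^(2^n) + 1."""
    return floor(2**n * log10(b)) + 1

print(gfn_digits(912_987, 24))      # just past the 100M-digit threshold
print(gfn_digits(400_000_000, 24))  # ~144M digits at Genefer's b limit
```

So at n=24 the entire searchable window 912,986 < b <= 400,000,000 yields candidates between 100M and roughly 144M digits, whereas n=25 doubles every candidate's size.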

Michael Goetz (Volunteer moderator, Project administrator)
Joined: 21 Jan 10 Posts: 13620 ID: 53948 Credit: 266,787,098 RAC: 299,875

Realistically speaking, a 100M digit search would be run at n=25.
N=25 is _too big_ for a 100M search. N=24 would be the optimal. You have the huge range of 912986 < b <= 400m, which ranges from 100m digits to about 144m digits. N=25 will double the sizes and therefore make it 8 times harder to find a prime (except for a small patch at the beginning).
True, I was assuming we'd run a normal search starting at the bottom of the b range. It could be done as you describe. Of course, if n=23 is impractical today, then n=24 is also impractical. Furthermore, running n=24 at such a high b would necessitate running one of the slower transforms.
Today, the software doesn't support running tests that large. Could it be changed to do so? Maybe. If the software supported those tests, could we run a 100M digit search? Yes. There's nothing stopping us, as a project, from doing so. Would we want to? Probably not. N=22 itself was a longshot, with a much more mundane goal by comparison: finding the world's largest known prime number. We still haven't found an n=22 prime. Nor an n=21. Nor an n=20. There's no proof that any GFN primes this large exist. We would be searching for something that might not exist, and even if it does, it will be exceptionally difficult to find. Chances are the energy costs of running the computers involved in that search would exceed the prize money long before a prime is found.
____________
My lucky number is 75898^{524288}+1  


Of course, when (if) GFN23 starts, I'll try it. But in some way it reminds me of the Fermat number F33 (over 2,500,000,000 digits). I can't even imagine what to do with that.

GDB
Joined: 15 Nov 11 Posts: 252 ID: 119185 Credit: 2,742,844,710 RAC: 2,116,737

Since we might want to run GFN23 within 5-10 years, doesn't that mean we should start running N=23 manual sieves now, so we can have a decent amount of sieving done before we start running GFN23?

JimB (Honorary cruncher)
Joined: 4 Aug 11 Posts: 916 ID: 107307 Credit: 974,532,191 RAC: 2

Since we might want to run GFN23 within 5-10 years, doesn't that mean we should start running N=23 manual sieves now, so we can have a decent amount of sieving done before we start running GFN23?
There are no plans to run GFN 23 that I know of. So we're not going to be sieving it.  

GDB
Joined: 15 Nov 11 Posts: 252 ID: 119185 Credit: 2,742,844,710 RAC: 2,116,737

Since the N=23 sieve isn't currently feasible, and the N=21 sieve is close to being shut off, why not reopen the poorly sieved N=22?

Rafael (Volunteer tester)
Joined: 22 Oct 14 Posts: 899 ID: 370496 Credit: 371,131,833 RAC: 113,700

Since the N=23 sieve isn't currently feasible, and the N=21 sieve is close to being shut off, why not reopen the poorly sieved N=22?
Nov 1 2017, by JimB:
Every six months I reevaluate the sieving levels. GFN16 will almost certainly reopen at that time, and GFN19, GFN20 and GFN22 certainly will. For purposes of the computation, I look at what we'll check with genefer over the next five years. I think that makes a lot of sense, especially now that we have to appeal to users for money to keep the servers hosted.
 
