
Message boards : Cullen/Woodall prime search : Are C/W primes more rare than Proth?
Bur (Volunteer tester)
Joined: 25 Feb 20 · Posts: 515 · ID: 1241833 · Credit: 415,342,448 · RAC: 18,445

At first glance, Cullen and Woodall primes seem to be quite rare compared to Proth primes. Is that just because they are special cases of Proth primes, or is the prime density actually lower?
If we say k <= n, then the number of Proth numbers in the range 0...n is n², correct? And we only have n Cullen or Woodall numbers in that range. So we would expect roughly the square root of the Proth count for C/W primes?
For n, k < 100, there are 433 Proth primes, while there are only 10 Cullen and only 6 Woodall primes instead of 433^(1/2) ~ 21. Is my reasoning wrong, is a larger n required, or is the density really much lower? If the latter, does anyone know why?
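The Cullen/Woodall side of this tally is easy to recheck in a few lines. A hedged sketch (not from the thread): primality is tested with fixed-base Miller-Rabin, which is proven exact only up to roughly 3·10^23 but has no known failure anywhere near the sizes involved here. For what it is worth, it finds a single Cullen prime in this range (n = 1):

```python
def is_prime(m):
    """Miller-Rabin with the first 12 prime bases; proven deterministic
    below ~3e23 and with no known counterexample beyond, which covers
    n*2^n +/- 1 for n < 100 for all practical purposes."""
    if m < 2:
        return False
    bases = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for p in bases:
        if m % p == 0:
            return m == p
    d, s = m - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in bases:
        x = pow(a, d, m)
        if x in (1, m - 1):
            continue
        for _ in range(s - 1):
            x = x * x % m
            if x == m - 1:
                break
        else:
            return False
    return True

cullen = [n for n in range(1, 100) if is_prime(n * 2**n + 1)]
woodall = [n for n in range(1, 100) if is_prime(n * 2**n - 1)]
print(cullen)   # [1]
print(woodall)  # [2, 3, 6, 30, 75, 81]
```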


> At first glance Cullen or Woodall primes seem to be quite rare compared to Proth primes. Is it just because they are special cases of Proth primes? Or is the prime density actually lower?
I suppose it's the former; someone correct me!
____________
My lucky number is 6219*2^3374198+1
 

Yves Gallot (Volunteer developer, Project scientist)
Joined: 19 Aug 12 · Posts: 842 · ID: 164101 · Credit: 306,511,698 · RAC: 5,382

> If we say k <= n, then the amount of Proth numbers in range 0...n is n², correct?
n² / 2, because k is odd.
> For n, k < 100, there are 433 Proth primes, while there are only 10 Cullen and only 6 Woodall primes
The density of Proth primes is 433 / (100²/2) = 8.66%.
The density of Cullen primes is 1 / 100.
The density of Woodall primes is 6 / 100.
In https://www.ams.org/journals/mcom/1995-64-212/S0025-5718-1995-1308456-3/ some divisibility properties are explained. Because of them there are few small Cullen primes. But for large n, the density may be close to the density of Proth primes.
If Proth numbers are sieved, they behave like random odd numbers: 3 removes 1/3 of them, 5 removes 1/5 of them, etc. This is no different with Cullen or Woodall numbers.
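This 1/p behaviour can actually be verified exhaustively, since n·2^n mod p is periodic in n with period p·ord_p(2). A sketch (names are ad-hoc, not from the thread):

```python
def mult_order(a, p):
    """Multiplicative order of a modulo prime p."""
    k, x = 1, a % p
    while x != 1:
        x = x * a % p
        k += 1
    return k

for p in (3, 5, 7, 11, 13, 23, 29):
    period = p * mult_order(2, p)
    # Count Cullen (n*2^n + 1) and Woodall (n*2^n - 1) terms divisible
    # by p over one full period: exactly a 1/p fraction in both cases.
    c = sum((n * pow(2, n, p) + 1) % p == 0 for n in range(1, period + 1))
    w = sum((n * pow(2, n, p) - 1) % p == 0 for n in range(1, period + 1))
    assert c * p == period and w * p == period
    print(p, c, period)   # each line: prime, hits, full period
```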

Yves Gallot (Volunteer developer, Project scientist)
Joined: 19 Aug 12 · Posts: 842 · ID: 164101 · Credit: 306,511,698 · RAC: 5,382

Going further, we can use the Proth primes found by PrimeGrid and the previous searches.
Let n <= 1,500,000.
Proth: for 1 < k <= 9999 (4,999 k's), 168,901 primes were found.
The ratio is 33.8 primes/k. We can compute the mean weight in this range and we have weight = 2.002 (2 <=> random odd numbers).
In this range we have 14 Cullen primes and 30 Woodall primes.
We can estimate the weights with these numbers of primes:
Integrate(1/log(n * 2^n), n <= 1,500,000) ~ 19.9
Cullen weight ~ 14 / 19.9 = 0.7
Woodall weight ~ 30 / 19.9 = 1.5
We have a better estimate with the current limit: n <= 18M
16 Cullen primes and 34 Woodall primes.
Integrate(1/log(n * 2^n), n <= 18M) ~ 23.5
Cullen weight ~ 16 / 23.5 = 0.68
Woodall weight ~ 34 / 23.5 = 1.4
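These expected counts are straightforward to reproduce; writing log(n·2^n) = log n + n·log 2 keeps the sum cheap (a sketch; only the n <= 1,500,000 case is shown, the 18M sum works the same way but takes longer):

```python
import math

def expected_primes(N):
    """Sum of 1/log(n * 2^n) for n = 1..N: the number of primes expected
    from a weight-1 random sequence of numbers of this size."""
    log2 = math.log(2)
    return sum(1.0 / (math.log(n) + n * log2) for n in range(1, N + 1))

e = expected_primes(1_500_000)
print(e)               # ~19.9
print(14 / e, 30 / e)  # Cullen weight ~0.7, Woodall weight ~1.5
```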
The formula indicated in Weight can also be applied to compute the number of expected primes.
With p_max = 10,000,000 and 10,000 n's, we have:
Estimated Cullen weight = 0.715
Estimated Woodall weight = 1.56
This means that the number of primes is related to the sieving process (many candidates are removed at this stage) and not to an unknown algebraic factorization.
Then the expected number of Proth primes for k < K and n < N is
K/2 * 2 * log_2(N) = K * log_2(N).
And the expected numbers of Cullen/Woodall primes are
Cullen: 0.7 * log_2(N)
Woodall: 1.5 * log_2(N)
I don't know if the divisibility properties of C/W_{(p+1)/2} and C/W_{(3p-1)/2} are sufficient to explain the values of the weight.

Bur (Volunteer tester)
Joined: 25 Feb 20 · Posts: 515 · ID: 1241833 · Credit: 415,342,448 · RAC: 18,445

n²/2, right, thanks.
So the density is approximately 9%, 1%, 6% when looking at just 100 numbers. That doesn't say much either way, to be honest. I had to manually count the Proth primes, which is why I only checked up to n = 100.
> Because of them there are few small Cullen primes. But for large n, the density may be close to the density of Proth primes.
That's interesting. So even a larger sample size might yield low values for C/W.
Mersenne, Thabit, Sierpinski, Riesel, Proth, Fermat, they are basically all the same (chosen for their relative ease to prove primality?), just with different restrictions about k and n and b. Do they all have the same prime density for large enough k, n, b? Are there other types of primes that have, either proven or conjectured, a different prime density?
Or, more generally, are there other types of primes at all? I mean "are" in the sense of being studied.  

Ravi Fernando (Project administrator, Volunteer tester, Project scientist)
Joined: 21 Mar 19 · Posts: 211 · ID: 1108183 · Credit: 14,457,450 · RAC: 4,270

> If Proth numbers are sieved, they are similar to random odd numbers: 3 removes 1/3 of them, 5 removes 1/5 of them, etc. This is no different with Cullen or Woodall numbers.
True, but the conditions for divisibility by different primes aren't independent. I would expect this to account for the different weights of the two sequences.


> Mersenne, Thabit, Sierpinski, Riesel, Proth, Fermat, they are basically all the same (chosen for their relative ease to prove primality?), just with different restrictions about k and n and b. Do they all have the same prime density for large enough k, n, b?
Similar density: very similar, but never the same.
> Are there other types of primes that have, either proven or conjectured, a different prime density?
I suppose Factorial, Primorial, and the WWWW projects. Also the Phi entries on top20 at T5K. Dunno what they're called.
> Or, more generally, are there other types of primes at all? I mean "are" in the sense of being studied.
Same as above.
____________
My lucky number is 6219*2^3374198+1
 

Yves Gallot (Volunteer developer, Project scientist)
Joined: 19 Aug 12 · Posts: 842 · ID: 164101 · Credit: 306,511,698 · RAC: 5,382

> If Proth numbers are sieved, they are similar to random odd numbers: 3 removes 1/3 of them, 5 removes 1/5 of them, etc. This is no different with Cullen or Woodall numbers.
> True, but the conditions for divisibility by different primes aren't independent. I would expect this to account for the different weights of the two sequences.
This is not an assumption but the result of sieving. It seems that Cullen or Woodall numbers are very different from Proth numbers with fixed k. Any odd prime p removes 1/p of them. The weight is low (< 2), but this is not because of a particular set of primes; it is because of relations like: if (2/p) = -1 then p | C_{(p+1)/2}, and if (2/p) = +1 then p | C_{(3p-1)/2}.
It is possible that the weight of Cullen or Woodall numbers is 2. When n is "small", many primes p remove some Cullen numbers (through the two relations). But when n is "large", almost all numbers are composite anyway and the relations remove few Cullen numbers. So the apparent low density of Cullen or Woodall primes may ultimately disappear.

Yves Gallot (Volunteer developer, Project scientist)
Joined: 19 Aug 12 · Posts: 842 · ID: 164101 · Credit: 306,511,698 · RAC: 5,382

> Mersenne, Thabit, Sierpinski, Riesel, Proth, Fermat, they are basically all the same (chosen for their relative ease to prove primality?), just with different restrictions about k and n and b. Do they all have the same prime density for large enough k, n, b? Are there other types of primes that have, either proven or conjectured, a different prime density?
> Or, more generally, are there other types of primes at all? I mean "are" in the sense of being studied.
The density of prime numbers in a sequence is an open problem.
The density of primes of the form a + n·b is known; this is Dirichlet's theorem on arithmetic progressions.
We know that the density can be 0 (Sierpiński numbers: 78557·2^n + 1, etc.) or 1 (Mills primes: floor(A^(3^n))).
Generally, the number of primes in a sequence is conjectural ("Where is the Next Mersenne Prime?", the probabilistic arguments for the finitude of Fermat primes, etc.).
The density is different for each sequence, and we check these conjectures with the prime number search.
For polynomials, the Bateman–Horn conjecture defines the prime density. It can be tested with the Generalized Fermat numbers b^(2^n) + 1, for fixed n and a large range of b's. Today no deviation is known and the conjecture is expected to be true.
Exponentials are much more complicated, but some conjectures exist (Mersenne primes, k·2^n + 1 for fixed k, etc.). It is more difficult to check them because of the exponential growth.
From my point of view any type of primes is interesting if it is related to a conjecture. Number theory is a science: we improve our knowledge from numerous experiments.
 


> [...] It seems that Cullen or Woodall numbers are very different from Proth numbers with fixed k. Any odd prime p removes 1/p of them. [...]
Let me fix the odd prime to 23 to see what happens.
Fix some k. Consider k*2^n + 1 for n = 1,2,3,...
If 23 divides k, clearly k*2^n + 1 is never divisible by 23.
Otherwise, we must check the order of 2 modulo 23. It turns out to be 11. Therefore, whether k*2^n + 1 is divisible by 23 or not (k is fixed) depends only on n modulo 11.
Modulo 23, there is a unique solution x to the congruence k*x + 1 == 0 (remember, k is not 0 modulo 23). So this case splits into two: either that x is one of the eleven powers of 2, and then 23 divides 1/11 of the terms in the sequence k*2^n + 1 for n = 1,2,3,...; or else that x is not a power of 2 mod 23, and then k*2^n + 1 is never divisible by 23.
Now, still with p=23, let us turn to Cullen numbers. That is, n*2^n + 1 for n = 1,2,3,...
The "front" factor n repeats modulo 23, every 23rd term. The other factor, 2^n, still repeats every 11th term. So the total pattern repeats every 23*11 = 253 terms. But for each of the 11 "essential" choices of the exponent (mod 11), there is going to be exactly one choice (mod 23) of the front factor. So this p, 23, divides 11/253 = 1/23 of the numbers n*2^n + 1.
Ah, now I understand what you were saying!
For Proth, we saw that either none (if 23 divides k, or if the relevant inverse is not in the multiplicative subgroup generated by 2) or 1/11 of the candidates are removed by 23. For Cullen, it is simply 1/23 of the candidates that are removed by 23.
Addition: For Proth, 11/23 of k values have the property that prime 23 divides 1/11 of all terms, and the remaining k values never get anything removed when sieving with 23. So the weighted average over all k is still that 23 divides 1/23 of terms in Proth.
/JeppeSN  
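The p = 23 counts in this post can be confirmed directly (a sketch; `order`, `hits`, and `lucky` are ad-hoc names):

```python
p = 23
order = next(r for r in range(1, p) if pow(2, r, p) == 1)
print(order)  # 11: divisibility of k*2^n + 1 by 23 depends only on n mod 11

# For each residue class of k mod 23: how many of the 11 exponent
# classes make 23 divide k*2^n + 1?
hits = {k: sum((k * pow(2, n, p) + 1) % p == 0 for n in range(order))
        for k in range(p)}
lucky = sorted(k for k, h in hits.items() if h > 0)
print(len(lucky))                        # 11 of the 23 classes are ever hit
print(all(hits[k] == 1 for k in lucky))  # True: each exactly once per period
# Weighted average over k: (11/23) * (1/11) = 1/23 of all Proth terms,
# matching the Cullen rate 11/253 = 1/23.
```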

Yves Gallot (Volunteer developer, Project scientist)
Joined: 19 Aug 12 · Posts: 842 · ID: 164101 · Credit: 306,511,698 · RAC: 5,382

Ah, now I understand what you were saying!
In fact you have done better, you proved it!
But as Ravi has pointed out, the next question is the dependency for divisibility by different primes.
Surprisingly, they may be independent: for the pair
- (3, 5) the ratio is 1/3 + 1/5 - 1/15
- (3, 7) the ratio is 1/3 + 1/7 - 1/21
- (5, 7) the ratio is 1/5 + 1/7 - 1/35
- ...???
The proof is left to the reader :o)
 

Bur (Volunteer tester)
Joined: 25 Feb 20 · Posts: 515 · ID: 1241833 · Credit: 415,342,448 · RAC: 18,445

> weight = 2.002 (2 <=> random odd numbers).
That means the weight of a series of random odd numbers is 2? So Proth numbers behave no differently from any random odd number when it comes to producing primes?
> In this range we have 14 Cullen primes and 30 Woodall primes.
> We can estimate the weights with these numbers of primes:
> Integrate(1/log(n * 2^n), n <= 1,500,000) ~ 19.9
> Cullen weight ~ 14 / 19.9 = 0.7
> Woodall weight ~ 30 / 19.9 = 1.5
What is the 19.9? The number of primes expected to be found in a random sequence of that size? So the weight tells how close we are to the expected number? Odd numbers have weight = 2 since they are twice as likely to produce a prime as a mix of odd and even numbers?
> Cullen: 0.7 * log_2(N)
> Woodall: 1.5 * log_2(N)
What does that mean? Woodall numbers are twice as likely to be prime as Cullen? But why do Proth numbers get k * log_2(N)? k can be quite large, and I don't think the prime density increases that strongly with large k. I probably don't grasp this properly...
> I suppose Factorial, Primorial, and the WWWW projects. Also the Phi entries on top20 at T5K. Dunno what they're called.
What is WWWW? I couldn't find anything for some reason. Those "Phi primes" are apparently called "unique primes", as their reciprocal 1/p has a unique period length in decimal representation. A period of 1, as for 1/3, is unique, but 1/353 has a period of 32, for example, which is long but not unique. T5K has a short explanation on "generalized uniques". It's fascinating that apparently it's possible to prove a prime is unique in that sense, but not possible to determine the length of the period.
I never heard of Mills' primes before, fascinating as well. :) Wouldn't that make a nice subproject? It could produce a lot of large primes. On the other hand that sounds too easy, so there's probably a catch? Is the exact value of A not known and the deviation becomes too large for large n? Or is it too difficult to test these numbers for primality?
Same goes for "unique primes", sounds like a very interesting subproject. But maybe again the required calculations are too slow?

Ravi Fernando (Project administrator, Volunteer tester, Project scientist)
Joined: 21 Mar 19 · Posts: 211 · ID: 1108183 · Credit: 14,457,450 · RAC: 4,270

> This is not an assumption but the results of sieving. It seems that Cullen or Woodall numbers are very different from Proth numbers with fixed k. Any odd prime p removes 1/p of them.
Ah, I think I misunderstood you before. You're right: the conditions are independent, so each odd p removes exactly 1/p of the remaining candidates (not just 1/p of all candidates).
> The weight is low (< 2) but this is not because of a particular set of primes but because of some relations like if (2/p) = -1 then p | C_{(p+1)/2} and if (2/p) = +1 then p | C_{(3p-1)/2}.
Strange. A few more of these:
If p is odd, then p | C_{p-1}.
If p is 1 mod 4 and 2 is a fourth power mod p, then p | C_{(9p-1)/4}.
If p is 1 mod 4 and 2 is a square but not a fourth power mod p, then p | C_{(7p+1)/4}.
I suppose one could keep going with this: if p is 1 mod 8 and 2 is an eighth power mod p, then (...). (Are there other generalizations?) But the assumptions that must be imposed get increasingly uncommon.
You may have done this already, but it might be worthwhile to recalculate the weight with p_max << n_max, to see how much the weight is affected by primes on the order of n.  
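These relations are easy to sanity-check numerically for small primes. A sketch (the helper `C(n, p)` just computes n·2^n + 1 mod p):

```python
def C(n, p):
    # n*2^n + 1 reduced modulo p
    return ((n % p) * pow(2, n, p) + 1) % p

# Odd primes below 1000 by trial division.
primes = [q for q in range(3, 1000, 2)
          if all(q % d for d in range(3, int(q ** 0.5) + 1, 2))]

for p in primes:
    assert C(p - 1, p) == 0                     # p | C_{p-1}
    if p % 4 == 1:
        if pow(2, (p - 1) // 4, p) == 1:        # 2 is a fourth power mod p
            assert C((9 * p - 1) // 4, p) == 0  # p | C_{(9p-1)/4}
        elif pow(2, (p - 1) // 2, p) == 1:      # 2 a square, not a 4th power
            assert C((7 * p + 1) // 4, p) == 0  # p | C_{(7p+1)/4}
print("relations verified for all odd primes below 1000")
```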

Ravi Fernando (Project administrator, Volunteer tester, Project scientist)
Joined: 21 Mar 19 · Posts: 211 · ID: 1108183 · Credit: 14,457,450 · RAC: 4,270

> But as Ravi has pointed out, the next question is the dependency for divisibility by different primes.
I was wrong: they're always independent. Proof idea: say we're given primes p_1 < ... < p_k. The probability that p_1 divides a given C_n is 1/p_1, and this depends only on the residues of n modulo p_1 and modulo (p_1 - 1). The probability that p_2 divides C_n is 1/p_2, and this is still true conditional on any congruence conditions on n modulo things coprime to p_2. (JeppeSN's post explains this in the case p_1 = 11, p_2 = 23.) And so on. Since we arranged the primes in increasing order, no new prime can divide any previous prime minus one.
> Surprisingly, they may be independent: for the pair
> - (3, 5) the ratio is 1/3 + 1/5 - 1/15
> - (3, 7) the ratio is 1/3 + 1/7 - 1/21
> - (5, 7) the ratio is 1/5 + 1/7 - 1/35
> - ...???
As they should be: (1 - 1/p)(1 - 1/q) = 1 - (1/p + 1/q - 1/pq).

Yves Gallot (Volunteer developer, Project scientist)
Joined: 19 Aug 12 · Posts: 842 · ID: 164101 · Credit: 306,511,698 · RAC: 5,382

> weight = 2.002 (2 <=> random odd numbers). That means the weight of a series of random odd numbers is 2? So Proth numbers behave no differently from any random odd number when it comes to producing primes?
That's right, for a large set of k's.
For fixed k, the number of primes differs (see Riesel's list).
> In this range we have 14 Cullen primes and 30 Woodall primes.
> We can estimate the weights with these numbers of primes:
> Integrate(1/log(n * 2^n), n <= 1,500,000) ~ 19.9
> Cullen weight ~ 14 / 19.9 = 0.7
> Woodall weight ~ 30 / 19.9 = 1.5
> What is the 19.9? The amount of primes expected to be found in a random sequence that size? So the weight tells how close we are to the expected number? Odd numbers have weight = 2 since they are twice as likely to produce a prime than odd+even numbers?
That's correct.
> Cullen: 0.7 * log_2(N)
> Woodall: 1.5 * log_2(N)
> What does that mean? Woodall numbers are twice as likely to be prime than Cullen?
In the range 1 <= n <= 18,000,000, 16 Cullen primes and 34 Woodall primes were found.
The amount of primes expected to be found in that range with a random sequence is 23.5.
Then the weights can be estimated W_c = 16 / 23.5 = 0.68 and W_w = 34 / 23.5 = 1.45
> But why do Proth numbers get k * log_2(N)?
Not k * log_2(N) but W_k * log_2(N).
For a random sequence, the expected number of primes is
Sum(1 / log(k * 2^n + 1), 1 <= n <= N) ~= log_2((N + log_2(k)) / (1 + log_2(k))) ~ log_2(N).
For fixed k, the number of primes is different but it is conjectured to be W_k * log_2(N), where the weight is a constant. W_k can be estimated with a sieve: see Weight.js.
Now the question of this thread is what about the density of Cullen and Woodall primes?
Surprisingly, it seems to be a new open problem.
It is reasonable to assume that we have
#(Cullen primes < N) ~ W_c * log_2(N)
#(Woodall primes < N) ~ W_w * log_2(N)
because Sum(1 / log(n * 2^n +/- 1), 1 <= n <= N) ~ log_2(N).
The problem is to know whether W_c ~= 0.7 and W_w ~= 1.5 are still correct as N -> oo.
I doubt this is true. Because of relations like "if p is odd, then p | C_{p-1}", a lot of small Cullen (or Woodall) numbers are composite.
But the density of the prime numbers decreases with their size, so eventually this sort of relation removes few Cullen/Woodall numbers.
W_c = W_w = 2 is a reasonable hypothesis; 18,000,000 is just too small for a good estimate.
> I suppose Factorial, Primorial, and the WWWW projects. Also the Phi entries on top20 at T5K. Dunno what they're called.
On T5K/Top20, you have a complete definition for each form. Phi are Cyclotomic polynomials.
> What is WWWW? I couldn't find anything for some reason.
Something like Wilson primes, Wieferich primes, Wall–Sun–Sun primes, Wolstenholme primes, but I'm not sure about the list.
> I never heard of Mills' primes before, fascinating as well. :) Wouldn't that make a nice subproject? It could produce a lot of large primes. On the other hand that sounds too easy, so there's probably a catch? Is the exact value of A not known and the deviation becomes too large for large n? Or is it too difficult to test these numbers for primality?
The problem is that it's a bit artificial, because you have to compute the primes first and then some digits of A. But it's fascinating that such a number exists. Note that a similar sequence exists with exponent 2 (floor(A^(2^n)), https://oeis.org/A059784).

Yves Gallot (Volunteer developer, Project scientist)
Joined: 19 Aug 12 · Posts: 842 · ID: 164101 · Credit: 306,511,698 · RAC: 5,382

> A few more of these:
> If p is odd, then p | C_{p-1}.
> If p is 1 mod 4 and 2 is a fourth power mod p, then p | C_{(9p-1)/4}.
> If p is 1 mod 4 and 2 is a square but not a fourth power mod p, then p | C_{(7p+1)/4}.
and
If p is odd, then p | C_{p-2}.
If p is odd, then p | C_{2(p-2)}.
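A quick numeric check of these two relations (a sketch, with C_n = n·2^n + 1 reduced modulo p):

```python
# Verify p | C_{p-2} and p | C_{2(p-2)} for all odd primes p below 2000.
primes = [q for q in range(3, 2000, 2)
          if all(q % d for d in range(3, int(q ** 0.5) + 1, 2))]
for p in primes:
    for n in (p - 2, 2 * (p - 2)):
        assert ((n % p) * pow(2, n, p) + 1) % p == 0
print("both relations hold for all odd primes below 2000")
```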
> You may have done this already, but it might be worthwhile to recalculate the weight with p_max << n_max, to see how much the weight is affected by primes on the order of n.
If p_max = 10M and n_max = 10,000 then W_c = 0.714817 and W_w = 1.56169.
If p_max = 10,000 and n_max = 10M then W_c = 1.99655 and W_w = 1.99784.
I noticed this "instability", and at first I thought that we must have p_max >> n_max because those estimates are close to the results of the prime search.
But today I think I was wrong, and that we must have p_max << n_max to "hide" the relations.
I think that we must extend our understanding of Cullen/Woodall divisibility properties before trying to estimate the weights with a sieve.  

Bur (Volunteer tester)
Joined: 25 Feb 20 · Posts: 515 · ID: 1241833 · Credit: 415,342,448 · RAC: 18,445

> The problem is that it's a bit artificial because you have to compute the primes first and then some digits of A.
I was under the impression that if the Riemann hypothesis is true, then A can be calculated? Or does that just mean that primes can then be used to calculate A?
Is there a chance we will ever be able to calculate A independently? To me it feels like, if mathematicians fully understand why floor(A^(3^n)) produces only primes, we should be able to determine A's exact value. Like it is possible to calculate digits of Pi without feeding experimentally determined pairs of C, r into Pi = C/(2r).
And thanks for explaining everything to a mathematical layman.

Ravi Fernando (Project administrator, Volunteer tester, Project scientist)
Joined: 21 Mar 19 · Posts: 211 · ID: 1108183 · Credit: 14,457,450 · RAC: 4,270

> The problem is that it's a bit artificial because you have to compute the primes first and then some digits of A. I was under the impression that if the Riemann hypothesis is true, then A can be calculated? Or does that just mean that primes can then be used to calculate A?
> Is there a chance we will ever be able to calculate A independently? To me it feels like, if mathematicians fully understand why floor(A^(3^n)) produces only primes, we should be able to determine A's exact value. Like it is possible to calculate digits of Pi without feeding experimentally determined pairs of C, r into Pi = C/(2r).
There are infinitely many (uncountably many, even) real numbers x such that floor(x^(3^n)) is prime for all n > 0. A is just the smallest one. There's nothing particularly special about it, so I doubt there's any reasonable formula for it that doesn't involve computing the primes involved. Even quoting the Riemann hypothesis is overkill here: all that matters is that there's always a prime between n^3 and (n+1)^3, which isn't known unconditionally but does (apparently) follow from RH.
Think about how you would try to calculate Mills' constant by hand. First, you know that floor(A^3) has to be prime, so if you want A to be as small as possible, you'd choose floor(A^3) = 2. This means that 2^(1/3) <= A < 3^(1/3). Then 8 <= A^9 < 27, and floor(A^9) has to be prime too, so the smallest possibility is 11. This further restricts the range that A can lie in. Then 1331 <= A^27 < 1728, so you pick the smallest prime in that range. And so on. Each step of the calculation gives you a better approximation of A.
Normally, there are a lot of primes between n^3 and (n+1)^3. (E.g.: four between 1 and 8, five between 8 and 27, 52 between 1331 and 1728.) It would be extremely surprising if there were an integer n > 0 for which there are no primes in that range, much more surprising than RH being false. But if it happened, then you might have to backtrack at some point in your calculation, e.g. pick 13 instead of 11. That's the only sense in which the value of A depends on RH.
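This greedy calculation is easy to script. A sketch (names are ad-hoc; Miller-Rabin with these fixed bases is deterministic for numbers this small, well below 3·10^23):

```python
def is_prime(m):
    # Miller-Rabin with the first 12 prime bases (exact below ~3e23)
    if m < 2:
        return False
    bases = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for p in bases:
        if m % p == 0:
            return m == p
    d, s = m - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for a in bases:
        x = pow(a, d, m)
        if x in (1, m - 1):
            continue
        for _ in range(s - 1):
            x = x * x % m
            if x == m - 1:
                break
        else:
            return False
    return True

mills = [2]
while len(mills) < 4:
    q = mills[-1] ** 3        # next term: smallest prime >= previous^3
    while not is_prime(q):
        q += 1
    mills.append(q)
print(mills)                  # [2, 11, 1361, 2521008887]
print(mills[-1] ** (1 / 81))  # ~1.30638, the first digits of Mills' constant
```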
