"Humans perceive flicker artifacts at 500?Hz"
Skip 500 Hz and go for positive integer multiples of 600 Hz,
since 600 Hz is an integer multiple of most common video frame rates:
factor(600)=2^3 * 3^1 * 5^2
15*40=600, 24*25=600, 25*24=600, 30*20=600, 50*12=600, 60*10=600, 100*6=600, 120*5=600, ...
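A quick sketch of that arithmetic (the frame-rate list is just a representative set of common video rates, my pick rather than anything from a spec; uses Python 3.9+ for math.lcm):

    from math import lcm

    # Common video frame rates in Hz (a representative set, not exhaustive).
    frame_rates = [24, 25, 30, 50, 60, 120]

    # Lowest refresh rate that can show every source with a whole number
    # of repeats per source frame.
    base = lcm(*frame_rates)
    print(base)  # 600

    # Repeat count for each source at 600 Hz.
    for fps in frame_rates:
        print(f"{fps} fps -> each frame shown {base // fps} times")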
oh wait... "several viewers reported visibility of flicker artifacts at over 800 Hz"
might have to go for 1200 Hz now... or, if you believe in doing everything at least 2x just to be sure, 2400 Hz.
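Same idea if the threshold keeps moving; a minimal sketch, where min_refresh is just a made-up helper that picks the smallest multiple of 600 at or above whatever Hz figure you trust:

    import math

    def min_refresh(base_hz: int, threshold_hz: int) -> int:
        # Smallest multiple of base_hz at or above the reported flicker threshold.
        return base_hz * math.ceil(threshold_hz / base_hz)

    print(min_refresh(600, 800))      # 1200
    print(2 * min_refresh(600, 800))  # 2400, the "2x just to be sure" option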
[p.s. they only used like 2-10 subjects for these tests]
Anyway, the real question is always "is the difference worth the cost?", not whether the difference can be measured or perceived in some way, although that knowledge can help in deciding whether the difference is worth the cost.