Double that and you're close.
Rather than throw out vague numbers, let's clear this up.
According to http://en.wikipedia.org/wiki/Visual_acuity:
"20/20 is the visual acuity needed to discriminate two points separated by 1 arc minute—about 1/16 of an inch at 20 feet."
20/20 vision means being able to resolve detail that subtends 1 arc minute of visual angle at 20 feet (a standard 20/20 Snellen letter subtends 5 arc minutes, with strokes and gaps 1 arc minute wide), and 1 arc minute is exactly the same as 1/60th of a degree of arc.
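To sanity-check that quote, here is a quick back-of-the-envelope calculation in Python (nothing assumed beyond basic trigonometry; the "about 1/16 of an inch" is approximate, the exact figure comes out nearer 1/14):

    import math

    # Size subtended by 1 arc minute (1/60 of a degree) at a distance of 20 feet
    distance_in = 20 * 12                        # 20 feet, in inches
    one_arcmin = math.radians(1 / 60)            # 1 arc minute, in radians
    size_in = distance_in * math.tan(one_arcmin)

    print(f"{size_in:.4f} inches")               # ~0.0698 in, i.e. roughly a sixteenth of an inch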
1/60th of a degree of arc may not be EXACTLY the resolution limit of the fovea in EVERY human eye, but it is what the visual system is capable of in the great majority of cases; 90% of the world's population does not have visual acuity as good as or better than 20/20, and a very large percentage of them do not even approach 20/20 acuity, even at their peak of health.
Plus, we do not see with 20/20 vision all of the time. Only a few percent of the visual field, the foveal region, has much acuity at all; 97% or so of what we see at any one moment is seen with very little acuity and is actually quite blurry.
We compensate for this continually with the saccadic movements of the eye: the eye jumps in small, very fast increments, taking a series of quick "snapshots" so that the tiny parts of the visual field we are looking at directly are seen in focus, one after the other. The brain then stitches these snapshots together during perception, creating the impression that far more of the visual field is in sharp focus than actually is. You are doing exactly that as you read this line of text; only every other word or so is ever in focus at any given moment.
20/16 or better probably corresponds to about the 98th percentile, and 20/8 is probably the absolute limit (the one person in 20 billion ever to walk the planet who could see better than everyone else who ever lived). If you want the limit for 99% of the world's population, it is approximately 20/10 or below, so in that sense your statement might have a grain of truth. But 20/10 is the extremely rare exception, not the norm.
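For reference, those Snellen fractions translate directly into angular resolution: a 20/X rating means the smallest resolvable detail subtends X/20 arc minutes. A quick Python sketch of just that definition, nothing else assumed:

    def snellen_to_arcmin(numerator, denominator):
        """Minimum angle of resolution, in arc minutes, for a Snellen rating like 20/10."""
        return denominator / numerator

    for denom in (20, 16, 10, 8):
        print(f"20/{denom}: {snellen_to_arcmin(20, denom):.2f} arc minutes")
    # 20/20 -> 1.00, 20/16 -> 0.80, 20/10 -> 0.50, 20/8 -> 0.40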
And 20/20 is the standard that optometry targets when correcting vision, on the assumption that this is the level of acuity the bulk of the healthy population has. It is probably representative of at least the 90th percentile, and improving vision beyond 20/20 quickly hits a wall of diminishing returns as far as any practical benefit of higher acuity is concerned, which is why it is the accepted standard.
And this is also exactly the same standard that was used to develop the HDTV resolutions adopted by the Grand Alliance, which implies that they researched this pretty thoroughly at the time. Most people sit farther away than the distance at which 20/20 vision can fully resolve the picture, and so ironically do not get the full benefit of resolutions even this high; you have to sit 7.8 feet or closer to a 60" screen to fully resolve HD, while most people have a 52-55" screen and sit 12-15 feet away from it.
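The 7.8-foot figure is easy to reproduce: it is just the distance at which one pixel of a 1920-wide picture on a 60" 16:9 screen subtends 1 arc minute. A rough Python sketch, assuming square pixels and a full 16:9 panel (the 55" line is only there for comparison):

    import math

    def full_resolve_distance_ft(diagonal_in, horiz_pixels, aspect=16 / 9):
        """Distance, in feet, at which one pixel subtends 1 arc minute of visual angle."""
        width_in = diagonal_in * aspect / math.hypot(aspect, 1)  # screen width from the diagonal
        pixel_in = width_in / horiz_pixels                       # width of a single pixel
        return pixel_in / math.tan(math.radians(1 / 60)) / 12

    print(f"{full_resolve_distance_ft(60, 1920):.1f} ft")        # ~7.8 ft for a 60-inch 1080p set
    print(f"{full_resolve_distance_ft(55, 1920):.1f} ft")        # ~7.1 ft for a 55-inch set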
The point is that once you produce a display that exceeds the visual acuity of 90% of the population (the other 10% may only see marginally better, and even if they did, it would not buy them much), the tradeoffs ramp up quickly and pushing further becomes an unsound endeavor. You have to draw the line somewhere, and the practical place to draw it is to produce "retina" displays that fully resolve for people with 20/20 vision, and no higher. That means the reasonable target probably should be exactly as I stated: 1/60th of a degree of arc.
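Turned around, that same 1/60-degree target tells you what pixel density a display needs at a given viewing distance before extra pixels stop mattering to a 20/20 eye. Another small Python sketch (the 12-inch and 24-inch distances are only illustrative assumptions, not anything from the discussion above):

    import math

    def retina_ppi(viewing_distance_in):
        """Pixel density at which one pixel subtends 1 arc minute (the 20/20 target)."""
        return 1 / (viewing_distance_in * math.tan(math.radians(1 / 60)))

    print(f"phone held at 12 in: {retina_ppi(12):.0f} ppi")   # ~286 ppi
    print(f"monitor at 24 in:    {retina_ppi(24):.0f} ppi")   # ~143 ppi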