Please explain to me how I should be reading that graph.
I think the best way to visualize what it's trying to convey is to imagine starting far away from the display in question (i.e. high on the graph's vertical axis) and moving towards it.
So, for a 50" display, you first notice the benefit of 720p at around 12' from the screen, of 1080p at 9', and of 4K at 5'.
That's all great from a theoretical point of view, but how many people really want to sit only 5 feet away from a 50" panel?
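If you want to sanity-check those figures yourself, here's a rough sketch of the 1-arcminute (20/20 acuity) rule those charts are usually built on. It's an assumption on my part about how that particular graph was drawn, and the numbers it produces won't match the chart line for line, since the chart's author may have used slightly different acuity or geometry assumptions.

```python
import math

# Rough 20/20-acuity calculation: for a 16:9 panel, find the farthest
# distance at which one pixel row of a given line count still subtends
# a full arcminute. Sit closer than that and the next resolution step
# up starts to pay off.
def benefit_distance_ft(diagonal_in, vertical_lines, acuity_arcmin=1.0):
    height_in = diagonal_in * 9 / math.hypot(16, 9)   # screen height of a 16:9 panel
    pixel_in = height_in / vertical_lines             # vertical pixel pitch
    distance_in = pixel_in / math.tan(math.radians(acuity_arcmin / 60.0))
    return distance_in / 12.0

# For a 50" panel: where stepping up from 480 -> 720 -> 1080 -> 2160 lines
# becomes visible. Prints very roughly 15 ft, 10 ft, and 6.5 ft.
for lines, step in [(480, "720p over 480"), (720, "1080p over 720p"), (1080, "4K over 1080p")]:
    print(f"{step}: worth it inside ~{benefit_distance_ft(50, lines):.1f} ft")
```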
1080 is still 1080, whether it's i or p. So if you've got 1080 lines on the source, it is best reproduced with a display that has 1080 native lines.
No argument there, that's basic signal processing theory. My point was that the vast majority of the content people are watching on 1080p-capable panels is only 720p. A lot of network TV content starts life as 1080i (that's what you'll get OTA), but by the time it comes out of a cable box's HDMI cable, it's been downsampled to 720p, which then has to be upscaled back to 1080p.
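You can see why that round trip hurts with a toy 1-D example. This is just a sketch: a single "column" of alternating lines (the finest vertical detail 1080 lines can carry), with plain linear resampling standing in for whatever filtering the cable box and TV actually apply.

```python
import numpy as np

src = np.tile([0.0, 1.0], 540)            # 1080 lines of one-line checker detail

x_1080 = np.arange(1080)
x_720  = np.linspace(0, 1079, 720)

down = np.interp(x_720, x_1080, src)      # downsample to 720 lines
up   = np.interp(x_1080, x_720, down)     # upscale back to 1080 lines

def detail(v):
    # Average line-to-line contrast: 1.0 means every line flips fully.
    return np.abs(np.diff(v)).mean()

print(f"original detail:  {detail(src):.2f}")   # 1.00
print(f"after round trip: {detail(up):.2f}")    # well below 1.0, and what's left is aliasing
```

Detail finer than 720 lines simply can't survive the 720-line bottleneck, no matter how good the upscaler is, which is why a 1080-line source looks best when it stays at 1080 all the way to a 1080-native panel.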