One point of confusion you may encounter when buying a new HDTV is the difference between 1080i and 1080p. Both are high-definition formats with 1920-by-1080 screen resolution, so you're probably wondering which one looks better. More importantly, should the difference affect your buying decision?
The main difference between 1080i and 1080p is in how each format draws the picture on the screen. 1080i, short for 1080 interlaced scan, displays images by alternately lighting up odd and even pixel rows: the 540 odd rows are drawn in one pass, followed by the 540 even rows in the next. Both sets of rows are refreshed so quickly, each set 30 times every second, that the constant swapping isn't noticeable to the human eye.
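If it helps to picture that split, here's a purely illustrative Python sketch (not any TV's actual firmware) that numbers the 1,080 rows and separates them into the two alternating sets:

```python
# Illustrative only: how an interlaced 1080-row frame splits into two fields.
FRAME_ROWS = 1080   # pixel rows 1 through 1080
FIELD_RATE = 60     # fields drawn per second (odd, even, odd, even, ...)

odd_field = list(range(1, FRAME_ROWS + 1, 2))    # rows 1, 3, 5, ... 1079
even_field = list(range(2, FRAME_ROWS + 1, 2))   # rows 2, 4, 6, ... 1080

print(len(odd_field), len(even_field))                # 540 540
print(FIELD_RATE // 2, "complete frames per second")  # each set repeats 30x/sec
```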
1080p, or 1080 progressive scan, draws every pixel row in order, top to bottom, refreshing the full picture 60 times every second. This produces smoother, more detailed images, especially in fast-motion scenes. That advantage is one reason 1080p is marketed as "true HD" or "full HD" to distinguish it from 1080i and 720p, and it's also why 1080p is the better choice of the two.
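To see where the smoother-motion claim comes from, this back-of-the-envelope sketch (again, just illustrative arithmetic) compares how many pixel rows each format actually refreshes per second:

```python
# Illustrative arithmetic: pixel rows refreshed per second by each scan method.
ROWS = 1080
REFRESHES_PER_SECOND = 60

progressive = ROWS * REFRESHES_PER_SECOND        # 1080p: all 1080 rows every pass
interlaced = (ROWS // 2) * REFRESHES_PER_SECOND  # 1080i: one 540-row set per pass

print(progressive, interlaced)                   # 64800 32400
```

In the same amount of time, 1080p redraws twice as many rows as 1080i, which is where the extra smoothness in fast motion comes from.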
However, the difference in image quality is subtle on a TV smaller than 42 inches. The technical edge of 1080p shrinks even further as your eyesight deteriorates and as you sit farther from the screen. Unless you're really fussy about specifications, you don't need to base your choice on one format's supposed advantage over the other; plenty of other factors affect TV image quality, too. And in any case, newer technologies are making both 1080p and 1080i obsolete. Soon, 4K or Ultra HD will become mainstream, and people will be drooling over HDTVs touting that resolution instead.