Best Selling HDTVs

August 14, 2006

1080i vs. 1080p - What's the Real Story?

It's just amazing to me how TV manufacturers seem to make HDTV buying so confusing with all of the terms and jargon we don't understand. One question that really confuses consumers is "Should I buy a 1080i or 1080p display?" While there are several articles on the web that answer that question, I really like this article at HomeTheaterMag.com that starts out:

There has been a lot of concern and confusion over the difference between 1080i and 1080p. This stems from the inability of many TVs to accept 1080p. To make matters worse, the help lines at many of the TV manufacturers (that means you, Sony) are telling people that their newly bought 1080p displays are really 1080i. They are idiots, so let me say this in big bold print: as far as movies are concerned, THERE IS NO DIFFERENCE BETWEEN 1080i AND 1080p. See, I did it in caps too, so it must be true. Let me explain (if your eyes glaze over, the short version is at the end).

For clarification, let me start by saying that there are essentially no 1080i TVs anymore. Unless you bought a CRT-based TV, every modern TV is progressive scan (as in LCD, plasma, LCOS, DLP). They are incapable of displaying a 1080i signal as 1080i. So what we're talking about here mostly applies to people with 1080p native displays.

If there's an HDTV in your future, be sure to check out this really informative article.

At HomeTheaterMag.com

Compare Prices: 1080i HDTVs

Compare Prices: 1080p HDTVs


Posted by William Hungerford at August 14, 2006 10:34 AM

Recent Comments

Some of you people have no freaking clue.

Per second:
480i  = 640x240 at 60 Hz, to render up to 30 fps video
480p  = 640x480 at 60 Hz, to render up to 60 fps video
720p  = 1280x720 at 60 Hz, to render up to 60 fps video
1080i = 1920x540 at 60 Hz, to render up to 30 fps video
1080p = 1920x1080 at 60 Hz, to render up to 60 fps video

This means that a 1080i-capable CRT will render film (24p) and broadcast (30p) material exactly the same as a 1080p display will, unless the display can render 120 Hz. However, even a 1080p plasma is inferior in overall picture to a native-1080i CRT display. Every 1080p-capable display on the market is inferior, in some way, to a CRT-based display.
That said, 1080p video that is recorded at more than 30 fps cannot be properly played back at 1080i.

In other words, as a general rule, 720p is inferior to 1080i in almost every way, unless the source video is 60 fps. 1080p is THE SAME as 1080i unless the source video is 60 fps, or the display is 120 Hz capable and the source video is 24p (24x5=120).
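
The arithmetic behind that list is easy to check; a minimal sketch in Python, using the frame geometries from the list above (pixels per second is just width x height x refresh rate):

    # Rough sketch of the per-second pixel arithmetic above.
    # Each entry: (width, lines delivered per refresh, refresh rate in Hz).
    modes = {
        "480i":  (640,  240, 60),   # half the lines per pass -> up to 30 full frames/s
        "480p":  (640,  480, 60),
        "720p":  (1280, 720, 60),
        "1080i": (1920, 540, 60),   # half the lines per pass -> up to 30 full frames/s
        "1080p": (1920, 1080, 60),
    }
    for name, (width, lines, hz) in modes.items():
        print(f"{name}: {width * lines * hz / 1e6:.1f} megapixels/s")

1080i and 1080p come out at 62.2 and 124.4 megapixels per second respectively, which is exactly the factor of two the comment is describing.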


Posted by: Steve at April 1, 2010 8:50 PM

Erick: DVD is 480p, not 720p.


Posted by: somebody at March 21, 2010 12:59 AM

Is it also a fact that a DVD can only provide a 720p resolution and a Blu-ray player can provide a 1080p resolution? Can an upconverting DVD player provide 1080p? Can a Blu-ray player upconvert DVD to 1080p? Is a bluebird blue? 480, 720, or 1080p? Gets kinda deep in TV land.


Posted by: Dave at March 11, 2010 7:21 PM

My last philosophical comment was not compatible with the tech stuff...

Well, as far as I know, if you want each pixel assigned its own effective value rather than an average, "p" is the letter. If you want a lower resolution upscaled to fit 1920x1080, "i" would be the one.


Posted by: Erick at December 11, 2009 8:59 PM

just buy an upscaling cable to watch 1080p. easy


Posted by: Raptor at October 20, 2009 6:06 AM

These arguments remind me of the early 70's, when stereo manufacturers were arguing about how cleanly their amplifiers produced sound. The numbers posted by the manufacturers were outside the normal hearing range of every human on the planet. So they began to argue about whether we could actually hear the quality of this absence of sound or not. When we watch or listen to anything, do we reeeally pay attention to how fast the TV refreshes itself, or to whether we can hear an absence of sound we can't hear?


Posted by: Doug H at June 21, 2009 11:47 PM

To Larry Thompson: just because it costs more doesn't make it better, and the correct word is layman. But in the past 2 years it seems that "p" has caught up with "i". Will it always be that way? Maybe.


Posted by: Harry Hallzkac at June 2, 2009 7:28 AM

1920 pixels left to right, right to left, by 1080 pixels top to bottom, bottom to top.
i = interlaced
p = progressive scan
EITHER WAY THE PICTURE IS ONLY AS GOOD AS THE SOURCE. POOR INPUT = POOR OUTPUT!
UPSCALING A DVD TO 1920x1080 JUST MAKES THE PICTURE BIGGER, NOT BETTER! IF YOUR SOURCE
IS 800x600 IT WILL LOOK POOR ON A MASSIVE 180CM TV NO MATTER HOW GOOD YOUR TV IS (even with upscaling).
HD (1920x1080, "p" or "i") ON A 22-INCH TV IS MADE SMALLER TO FIT THE SMALL DISPLAY, SO QUALITY IS LOST.
I'm a PC gamer who knows how to get the best out of a TV or monitor. All I'm trying to say is your TV is only as good as the signal you put in.

Any questions???
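
One way to see the "bigger, not better" point is nearest-neighbour upscaling, sketched below in plain Python. This is a deliberately crude method (real upscalers interpolate, but they cannot invent detail either): every output pixel is a copy of an input pixel, so the image grows without gaining any information.

    # Nearest-neighbour upscale: every output pixel is a copy of an
    # existing input pixel, so no new detail is created.
    def upscale(image, factor):
        return [
            [row[x // factor] for x in range(len(row) * factor)]
            for row in image
            for _ in range(factor)
        ]

    source = [[1, 2],
              [3, 4]]          # a tiny 2x2 "image"
    big = upscale(source, 2)   # now 4x4, but still only four distinct values
    for row in big:
        print(row)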


Posted by: The Noobinator at August 11, 2008 7:41 PM

After reading all the comments, I am still not sure which format is better.


Posted by: Ed at August 2, 2008 12:06 AM

Phew, wicked good job: both our 42" Samsung HD DLP TV and our 62" LG HD DLP TV accept 1080p. I never knew about any real difference till about 2 months ago, when I was about to get a PS3 for GTA 4.

well that's the TVs sorted...

until we can get robots!


Posted by: lunasea at June 5, 2008 7:49 PM

Get a life all of you, YOU FAIL


Posted by: Jamie and oliver at May 22, 2008 5:17 AM

This is incredibly simple:
1080p = 1920x1080 @ 60fps
1080i = 1920x1080 @ about 30fps
720p = 1366x768 @ 60fps
720i = 1366x768 @ 30fps

Number = number of horizontal lines.
i = interlaced; whether it combines or not is neither here nor there, the framerate appears slower.
p = progressive, non-interlaced; it simply displays the picture as it is at the full 60fps (max refresh rate of the TV), because there is no recombination to be done.

See?


Posted by: Jon at December 25, 2007 8:50 AM

Who gives a shit! I got two Samsungs, a 42" 1080p and an older 46" 1080i LCD... they look the same!


Posted by: Flipper at December 24, 2007 6:01 PM

Even if your display is 1080p, you need HD content before you get 'true' HD quality. Either the broadcast or the video output should be HD.

If you are using an HD player like Blu-ray or HD VMD, you will get 1080p output provided the content is in an HD format. If it is a standard DVD that you are playing through your HD VMD or Blu-ray player, you will get an upscaled 1080i version, which in turn will be de-interlaced by some TVs to give you a 1080p or 720p output.

Incidentally, 1080i is better for watching live sports telecasts where there is fast action in the frames. The interlaced field rate is twice the progressive frame rate; hence it is able to give you a smoother, non-jerky picture in action footage.
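
That de-interlacing step can be sketched in a few lines of Python. This is a plain "weave" (re-interleaving the two fields into one frame), which is only the simplest of the methods real TVs use, so treat it as an illustration of the idea rather than what any particular set does:

    # Weave deinterlacing: interleave the even-line field and the
    # odd-line field back into a single progressive frame.
    def weave(even_field, odd_field):
        frame = []
        for even_line, odd_line in zip(even_field, odd_field):
            frame.append(even_line)
            frame.append(odd_line)
        return frame

    even = ["line 0", "line 2", "line 4"]
    odd  = ["line 1", "line 3", "line 5"]
    print(weave(even, odd))   # ['line 0', 'line 1', ..., 'line 5']

Weave is perfect for static images; the artifacts people associate with interlacing appear because the two fields are captured a fraction of a second apart, which is exactly the fast-motion case discussed above.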


Posted by: netra at November 20, 2007 12:54 AM

I find it funny how people jump headfirst for good marketing gimmicks. 1080i vs. 1080p is just a great lesson in business. Owning an HDTV isn't just about the visual quality; it's about status, and the companies capitalize on this because they know there are a lot of people out there who need to know they have the "best."

Once we pass the limits of our vision, I fully expect some "videophile" to get an eye transplant and begin to blog about their TV's ultra resolution.


Posted by: Oledurt at November 15, 2007 10:56 PM

So you're saying it doesn't matter if you buy a TV with 1080p or 1080i because they are both almost the same, correct? thanks, luis


Posted by: luis at November 4, 2007 4:35 PM

All you idiots that like to scream and yell about your 1080p vs. 1080i, go to this website. It will give you the facts, not the opinions of these lame losers with hot attitudes: http://www.hometheatermag.com/gearworks/1106gear/ Bottom line... 1080p is better in the long run. It has more pros and fewer cons, but visually there is no difference between them; the only difference is the mechanical aspect.


Posted by: Tysoc at July 1, 2007 9:01 PM

YO, dumbass. Let us look at it in a very simple way. The 1080p sets cost more than the 1080i sets. How do you explain that? Think before you call people "idiots". If a TV could actually show 1080p, then why would the manufacturer not call it 1080p? Why are there some TVs being advertised as 1080i? You want us to believe the manufacturer is stupid?

Now for a more technical argument which will surely obliterate your baseless claims. There are many parts to a TV. First there is the display panel itself, then there is the processing hardware that drives the display. Any incoming signal passes through this processor, which decodes the signal and displays it on the screen. Now, to display 1080i, these displays do in fact deinterlace it. BUT THE FACT STILL REMAINS THAT THIS IS WORSE THAN 1080p (which does not need to be deinterlaced). The deinterlacing does reduce its quality.

What you are really trying to say is that the display panel itself (plasma or LCD), if capable of displaying 1080i, is also capable of displaying 1080p. This is VERY true. But you are missing one vital detail. Some processing hardware (the processing chip and other DSPs) CANNOT accept 1080p signals due to bandwidth limitations; it can only downsample these signals to display them. This is why some TVs are labelled 1080i even though the display panels themselves are always capable of displaying 1080p. The TV as a whole CANNOT display 1080p in its full quality because of the crappy chips they put in them.
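
That bandwidth limitation can be modelled as a toy calculation; a minimal sketch in Python, where the numbers are the raw pixel rates of the two signal formats and the "downconvert" behaviour is illustrative, not any particular chipset:

    # Toy model of the processing-chip limitation described above:
    # a chip provisioned for 1080i bandwidth sees only half the pixels
    # per second that a 1080p/60 signal delivers.
    PIXEL_RATE_1080I = 1920 * 540 * 60    # 60 fields/s of 540 lines each
    PIXEL_RATE_1080P = 1920 * 1080 * 60   # 60 full frames/s

    def accept(signal_rate, chip_limit):
        if signal_rate <= chip_limit:
            return "passed through at full quality"
        return "downconverted to fit the chip's bandwidth"

    print(accept(PIXEL_RATE_1080P, PIXEL_RATE_1080I))  # downconverted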

Now, for the final explanation for the lay man:

Your 1080i TV SUCKS,
My 1080p TV KICKS ASS.


Posted by: Sid at May 29, 2007 11:48 AM

If a picture is worth a thousand words, is it worth more in 1080p?


Posted by: Larry Thompson at May 16, 2007 8:04 PM

For regular cable viewing, sports, DVDs, etc., does it really matter if I purchase a 720p, 1080i, or 1080p LCD? From what I am finding online, no one seems to agree on what is best.


Posted by: keith at January 22, 2007 9:26 PM

1080i and 1080p are different!

The only difference is the way the lines are loaded onto the screen. Interlaced images load lines as odds and evens. The evens will load first, shortly followed by the odds, e.g. lines 2, 4, 6, 8, 10 will load, then 1, 3, 5, 7, 9, and so on.

1080p... progressive scan images load sequentially, which results in the same 1080 resolution. There is no image quality difference if you're watching a true 1080 HD film or programme.
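
The even-then-odd ordering just described can be made concrete with a short Python sketch (using 1-based line numbers to match the example above; note that which field is actually sent first varies by broadcast standard):

    # Split a progressive frame into its two interlaced fields,
    # even-numbered lines first, as in the example above.
    def fields(frame):
        even = [line for n, line in enumerate(frame, start=1) if n % 2 == 0]
        odd  = [line for n, line in enumerate(frame, start=1) if n % 2 == 1]
        return even, odd

    frame = [f"line {n}" for n in range(1, 11)]
    even_field, odd_field = fields(frame)
    print(even_field)  # ['line 2', 'line 4', 'line 6', 'line 8', 'line 10']
    print(odd_field)   # ['line 1', 'line 3', 'line 5', 'line 7', 'line 9']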

The only difference is when you watch your standard definition programming or DVDs on a 1080i display. Progressive scan sets will display standard definition far better than interlaced display technology.

But then, as Scott says (just below this comment), you don't really want to be playing standard definition through 1080i anyway. 720p is about as high as you can go without noticing any substantial image quality loss from standard definition.


Posted by: Chris at January 10, 2007 6:53 PM

I am looking to buy a new upconverting DVD player (Oppo DV-970HD); more than likely it's a 1080i player. Or should I get one of the newer ones that are just coming out that are 1080p?

BTW, I have a Sony SXRD 60-inch KDS-60A2000.

I am thinking it doesn't matter, since it's best to set the resolution at 720p to watch a normal DVD. Am I correct?


Posted by: Scott at November 3, 2006 5:02 AM

I see lines in 1080i and I don't see those little lines in 1080p. I am using Vegas 6.0 and everything is in HDV. How do I burn in 1080i or 1080p on a DVD or Blu-ray? Can I, or am I limited to 720x576? neil


Posted by: neil at October 5, 2006 2:58 PM

1440x768 is the broadcast limitation.

Check out Wikipedia, search 1080i... there's a good illustration at the bottom of the page.

1080p sets blow away 1080i sets if the processing DSP hardware/software is designed well, especially if everything is upconverted to 1080p, as in the case of the newest Sony X and Bravia HD sets. Later...


Posted by: amatot at September 15, 2006 12:16 PM

You are an idiot, because 1440x768 is not 1080i. That is 720p. He is correct about the TVs displaying progressive scan also.


Posted by: Darwin at September 11, 2006 4:22 PM

The display is ALWAYS 1080p. The interlaced signal is received; the signals are combined and then displayed. The time it takes to combine the two signals is what reduces the frame rate. Where did you come up with 1440x768 for 1080i? BTW, I own a 1080p set... there is NO difference.


Posted by: Jim at September 11, 2006 9:58 AM

The person who wrote this article is an idiot.
1080p - 1920x1080
1080i - 1440x768
1080p is a substantial gain in quality (as well as being progressive instead of interlaced). For still images you may not see a difference, but for high motion there is a HUGE difference.


Posted by: rich at August 31, 2006 2:22 PM