Level 1
 
Playstation Staff

Re: 1080 for movies, 720 for gaming!


TrueSlawter wrote:

willj12 wrote:

Barso wrote:

I would really like an option on the XMB that automatically selects 1080i when I play a Blu-ray movie or use the XMB, and switches to 720p when I put a game in.

I just don't like having to select it myself from the display options and having to switch back when I start to play a game.


If your TV is 720p/1080i it will have a 1366x768 panel, so using 1080i (1920x1080) requires more scaling and will look worse than 720p (1280x720). Also, 1080i is interlaced; depending on how good the video processor in your TV is at deinterlacing, that can also make it look worse.




I thought 1920x1080 was 1080p, and 1680x1050 was 1080i? If I'm wrong, what is 1680x1050?

Yep, 1080p is 1920x1080 and 1080i is 1920x540.
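
(As a rough illustration of the scaling point in willj12's quoted post above, here is a minimal sketch of the arithmetic, assuming a typical 1366x768 "HD Ready" panel; the figures are illustrative only, not from any particular TV.)

```python
# Illustrative scaling arithmetic for a 1366x768 "HD Ready" panel:
# a 720p source is scaled up only slightly, while a 1080-line source
# must be scaled down, discarding detail the panel cannot show.

PANEL = (1366, 768)  # assumed native panel resolution

def scale_factor(source, panel=PANEL):
    """Per-axis scale factor from source resolution to panel resolution."""
    return (panel[0] / source[0], panel[1] / source[1])

for name, source in [("720p", (1280, 720)), ("1080i/1080p", (1920, 1080))]:
    sx, sy = scale_factor(source)
    print(f"{name:>12}: {source[0]}x{source[1]} -> x{sx:.2f} / x{sy:.2f}")

# Output:
#         720p: 1280x720 -> x1.07 / x1.07
#  1080i/1080p: 1920x1080 -> x0.71 / x0.71
```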

Message 11 of 19 (375 Views)
Level 3
 
Playstation Staff

Re: 1080 for movies, 720 for gaming!


willj12 wrote:

Yep, 1080p is 1920x1080 and 1080i is 1920x540.


Wrong, they're both 1920x1080.

1080i is technically a higher resolution in terms of the amount that is displayed per frame (NOT to be mixed up with 'per field refresh'). However, because it refreshes odd and then even fields per frame update (going through two update passes to refresh one frame), the actual real-time representation of the data is inferior. The total frame is still higher resolution, but since the frame consists of two sets of alternating field refreshes, the detail is degraded by the alternating fields.

You seem to be of the impression that 1920x1080i basically equates to 1920x540p in terms of detail - this is false. You seem to miss the fact that the data stored in each alternate field refresh is different, and the two field refreshes ultimately make one single frame. It is the manner of refresh that is different.
What you are saying would only make sense if the frame were made of only one of the field passes (which would hold half the detail/lines of the entire frame), which we know is not the case.
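
(A toy sketch of the odd/even field weave being described here, using a six-line stand-in for a 1080-line frame; purely illustrative.)

```python
# Toy illustration of interlacing: a full frame is split into an even
# and an odd field, each holding half the lines; weaving the two
# fields back together reconstructs the complete frame.

frame = [f"line {i}" for i in range(6)]  # stand-in for a 1080-line frame

even_field = frame[0::2]  # lines 0, 2, 4 -- one field refresh
odd_field = frame[1::2]   # lines 1, 3, 5 -- the alternate field refresh

rebuilt = [None] * len(frame)
rebuilt[0::2] = even_field  # first pass: even lines
rebuilt[1::2] = odd_field   # second pass: odd lines

assert rebuilt == frame  # the two fields together carry the full detail
```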

Message 12 of 19 (370 Views)
Level 3
 
Playstation Staff

Re: 1080 for movies, 720 for gaming!

Me too. Preset the PS3 to play games/browser at 720p and movies at 1080p... how hard can it be?
"we are all in the gutter, but some of us are looking at the stars"
Oscar Wilde
Message 13 of 19 (349 Views)
Level 3
 
Playstation Staff

Re: 1080 for movies, 720 for gaming!

[ Edited ]

kingchaz wrote:

Wrong, they're both 1920x1080.

1080i is technically a higher resolution in terms of the amount that is displayed per frame (NOT to be mixed up with 'per field refresh'). However, because it refreshes odd and then even fields per frame update (going through two update passes to refresh one frame), the actual real-time representation of the data is inferior. The total frame is still higher resolution, but since the frame consists of two sets of alternating field refreshes, the detail is degraded by the alternating fields.

You seem to be of the impression that 1920x1080i basically equates to 1920x540p in terms of detail - this is false. You seem to miss the fact that the data stored in each alternate field refresh is different, and the two field refreshes ultimately make one single frame. It is the manner of refresh that is different.
What you are saying would only make sense if the frame were made of only one of the field passes (which would hold half the detail/lines of the entire frame), which we know is not the case.


 

That's not right either - you're getting your frame rates, refresh rates and progressive/interlaced mixed up. 1080i is an interlaced signal, which means that only every other line of the image (the odds and evens you refer to) is drawn per frame, and each of those frames contains only 1920x540 pixels. Hence it is the current standard for broadcast HD content, as it only requires half the data bandwidth.

 

What you describe as "2 update passes to refresh one frame" is the processing that the receiving TV carries out before displaying the image - a process also known as 'line doubling', where the TV takes each pair of consecutive odd/even 1920x540 frames and combines them into a single frame that it can display progressively. This is why a native progressive feed is generally perceived as 'smoother': the line-doubling process has the effect of halving the frame rate of a 1080i signal, so 1080i @ 60fps would be displayed after line doubling as 1080p @ 30fps. Additionally, because each odd/even frame is slightly different (or quite a lot different in very fast-moving scenes), line doubling produces artefacts where you can see two halves of two different images hashed together - how visible that is depends on how good a job your TV's processor does of masking it.

 

But the bottom line is, half the amount of data does equal half the amount of picture. That is counteracted by the process above to create half as many full-frame images, but it cannot produce the same results as a proper 1080p feed (i.e. Blu-ray movies), where the entire image is drawn for every frame. So to summarise: a 1080p @ 60fps feed really is 60 full images drawn every second at 1920x1080, whereas 1080i @ 60fps is 60 half-images drawn every second at 1920x540, which your TV processes into 30 full images per second at 1920x1080 - and the latter is a lower quality image.
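
(To put that summary into numbers, a minimal sketch of the raw pixel-rate arithmetic; these are uncompressed pixel counts for comparison only, not broadcast bitrates.)

```python
# Raw pixels per second delivered by each mode (uncompressed, purely
# for comparison -- real broadcast streams are heavily compressed).

def pixels_per_second(width, lines, rate):
    return width * lines * rate

p60 = pixels_per_second(1920, 1080, 60)  # 1080p60: 60 full frames/s
i60 = pixels_per_second(1920, 540, 60)   # 1080i60: 60 half-height fields/s

print(f"1080p60: {p60:,} px/s")  # 124,416,000
print(f"1080i60: {i60:,} px/s")  # 62,208,000
print(f"ratio:   {p60 // i60}x") # 1080i carries half the data;
                                 # after weaving, 60 fields/s -> 30 frames/s
```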

Message Edited by SpikyCat on 23-08-2008 05:53 AM

Check out my blog: http://www.spikycat.com/
Message 14 of 19 (341 Views)
Level 3
 
Playstation Staff

Re: 1080 for movies, 720 for gaming!

[ Edited ]


What are you going on about, SpikyCat? I suggest you read my post again and this time try to understand it.

You've just repeated what I've said, more or less.
Message Edited by kingchaz on 23-08-2008 12:28 PM
Message 15 of 19 (281 Views)
Level 2
 
Playstation Staff

Re: 1080 for movies, 720 for gaming!

I appreciate all the help, but all I know is that my Blu-ray movies (e.g. Batman Begins) look far better and sharper in 1080i than in 720p, while my games (e.g. GT5 Prologue) look rubbish in 1080i.

I would just like an auto-switching option, please.

Message 16 of 19 (268 Views)
Level 3
 
Playstation Staff

Re: 1080 for movies, 720 for gaming!


kingchaz wrote:

What are you going on about, SpikyCat? I suggest you read my post again and this time try to understand it.

You've just repeated what I've said, more or less.

No really, I did understand what you said, but what I'm saying is that, contrary to what you said, it is not actually wrong to say a 1080i feed is 1920x540, because it is made up of frames of that resolution. Each pair of 1920x540 frames only becomes a single 1920x1080 frame as a result of the TV's picture processing. And in fact, on a 720p/768p native panel it will be downscaled anyway, so 1080i becomes irrelevant. There are some old HDTVs with a native resolution of 1920x1080 that can only handle interlaced rather than progressive signals; feeding 1080i to one of those results in an interlaced image made up of successive 1920x540 frames.

 

Might be technicalities, but people get easily confused about these things and end up convincing themselves that they're seeing extra detail with 1080i. :smileywink:
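
(A sketch of the 1080i-on-a-768p-panel pipeline described above, assuming a simple weave-then-scale TV processor; the stage names here are illustrative, not any real TV's processing chain.)

```python
# Illustrative 1080i pipeline on a 1366x768 native panel: the set
# weaves field pairs into full frames, then downscales -- so any
# extra 1080i detail is discarded at the final stage anyway.

signal = {"resolution": (1920, 540), "rate": 60, "kind": "field"}

def weave(s):
    """Combine pairs of fields into progressive frames (halves the rate)."""
    w, h = s["resolution"]
    return {"resolution": (w, h * 2), "rate": s["rate"] // 2, "kind": "frame"}

def scale_to_panel(s, panel=(1366, 768)):
    """Resample frames to the panel's native resolution."""
    return {**s, "resolution": panel}

print(scale_to_panel(weave(signal)))
# {'resolution': (1366, 768), 'rate': 30, 'kind': 'frame'}
```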


Check out my blog: http://www.spikycat.com/
Message 17 of 19 (247 Views)
Level 3
 
Playstation Staff

Re: 1080 for movies, 720 for gaming!

I'm not going to go into much detail as I don't have time, but to put it bluntly, 1080i is really kind of pointless.

The 'p' will always be better than the 'i' when watching movies or playing games. Once the TV has messed about with all the scaling it has to do, you'll just end up with a worse picture anyway.

Message 18 of 19 (236 Views)
Level 3
 
Playstation Staff

Re: 1080 for movies, 720 for gaming!


SpikyCat wrote:

No really, I did understand what you said, but what I'm saying is that, contrary to what you said, it is not actually wrong to say a 1080i feed is 1920x540, because it is made up of frames of that resolution. Each pair of 1920x540 frames only becomes a single 1920x1080 frame as a result of the TV's picture processing. And in fact, on a 720p/768p native panel it will be downscaled anyway, so 1080i becomes irrelevant.

Might be technicalities, but people get easily confused about these things and end up convincing themselves that they're seeing extra detail with 1080i. :smileywink:


 

Agree on this we will. :smileyvery-happy:
Message 19 of 19 (230 Views)