
22 Aug 2008
By Barso


1080 for movies, 720 for gaming!

18 Replies 607 Views Created 22-08-2008

I would really like an option on the XMB that automatically selects 1080i when I play a Blu-ray movie or am using the XMB, and switches to 720p when I put a game in.

I just don't like having to select it myself from the display options and then having to switch back when I start to play a game.


guyvernoid22

Your answer is a new TV! My 1080p TV automatically sets the resolution whether I put in a game or a Blu-ray movie.

daveac


I don't bother downloading the movie clips at 720p - I much prefer the 1080p clips, as they give a better idea of the actual picture on the Blu-ray release.

 

Cheers, daveac

 


kingchaz


guyvernoid22 wrote:
Your answer is in a new TV! My 1080p TV automatically sets the resolution for when I put a game in or a blu-ray movie.


Your TV is scaling.

May I ask, does your TV allow 1:1 pixel mapping or have a 'just scan' option? If yes, I would let the PS3 set the resolution and select either of those options.


willj12


Barso wrote:

I would really like an option on the XMB that automatically selects 1080i when I play a Blu-ray movie or am using the XMB, and switches to 720p when I put a game in.

I just don't like having to select it myself from the display options and then having to switch back when I start to play a game.


If your TV is 720p/1080i it will have 1366x768 pixels, so using 1080i (1920x1080) requires more scaling and will look worse than 720p (1280x720). Also, 1080i is interlaced; depending on how good the video processor in your TV is at deinterlacing, that can make it look worse as well.
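A quick back-of-the-envelope sketch of the scaling described here (the resolutions are the standard ones, and the panel size is the 1366x768 from the post above):

```python
# Rough sketch: scaling factors a 1366x768 panel applies to each input signal.
signals = {
    "720p":  (1280, 720),
    "1080i": (1920, 1080),
}
panel_w, panel_h = 1366, 768

for name, (w, h) in signals.items():
    print(f"{name}: {w}x{h} -> {panel_w}x{panel_h} "
          f"(x{panel_w / w:.2f} horizontal, x{panel_h / h:.2f} vertical)")
```

720p is only slightly stretched (about 1.07x each way), while 1080i has to be deinterlaced and then scaled down (about 0.71x), so the gentler path often looks cleaner on such a panel.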


rawmetal

I think there is an option in the PS3's display settings to set it to automatic, so you don't have to do it yourself all the time.

DefiLe


Is 720p vs 1080i worth being concerned about? Yes and no. If you're a consumer looking for a new TV, you can happily ignore the 720p vs 1080i debate because every TV which is described as HDTV or HDTV Ready is required to support both formats.

NOTE: You should be aware though that lots of TVs which support 1080i have fewer
than 1080 lines and so scale the 1080 signal down. That's not a huge issue as even
scaled down 1080i is far ahead of a regular NTSC signal. It is worth bearing in mind
that more expensive HDTVs tend to have better scalers than cheaper ones, and this
may be an issue.

However, for broadcasters it's a live issue. Should they broadcast 1080 lines of
interlaced video or 720 lines of progressive scan? They could just broadcast two
signals, one in each format, but that would use up a huge chunk of bandwidth and
be hugely expensive for very little gain.

To answer the question, it's important to understand the difference between 720p
and 1080i. A 720p signal is made up of 720 horizontal lines. Each frame is displayed
in its entirety on-screen for 1/30th of a second. This is known as progressive scan
(hence the 'p'). The quality is like watching 30 photographic images a second on TV.
A 1080i signal comprises 1080 horizontal lines, but all the lines are not displayed
on-screen simultaneously. Instead, they are interlaced (hence the 'i'), i.e. every other
line is displayed for 1/60th of a second and then the alternate lines are displayed
for the next 1/60th of a second. So the frame rate is still 30 frames per second, but each
frame is split into two fields, which your brain then puts together subconsciously.

Most of the time interlacing works fine, but for fast-moving images, such as sports
like baseball and hockey, it can cause problems which manifest themselves as a
'stepping' effect on-screen. Progressive scan signals don't have this problem and so
are better suited to sports.

ESPN puts it like this: 'Progressive scan technology produces better images for the
fast moving orientation of sports television. Simply put, with 104 mph fastballs in
baseball and 120 mph shots on goal in hockey, the line-by-line basis of progressive
scan technology better captures the inherent fast action of sports. For ESPN,
progressive scan technology makes perfect sense.'

Bottom line? For us as consumers, 720p vs 1080i is not a debate worth worrying
about, so you can relax and focus on all the other criteria on your list when you buy
your next HDTV.

Kenny Hemphill is the editor and publisher of The HDTV Tuner - a guide to the kit, the technology and the programming on HDTV.
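The field/frame arithmetic in the article above can be checked in a few lines; the 1080p@60 figure is added here purely for comparison and isn't from the article:

```python
# Pixel rates implied by the formats discussed in the article above.
p720_30  = 1280 * 720  * 30   # 720p: 30 complete frames per second
i1080_60 = 1920 * 540  * 60   # 1080i: 60 fields/s, each field = every other line
p1080_60 = 1920 * 1080 * 60   # full 1080p at 60 frames/s, for comparison

print(f"720p@30 : {p720_30:>12,} px/s")
print(f"1080i@60: {i1080_60:>12,} px/s")
print(f"1080p@60: {p1080_60:>12,} px/s")

# Two 540-line fields make one 1080-line frame, so 1080i at 60 fields/s
# delivers 30 full frames/s using half the pixel rate of 1080p@60.
assert i1080_60 * 2 == p1080_60
```

That halving is why broadcasters favour 1080i: more lines per frame than 720p, without the bandwidth cost of full progressive 1080.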


TrueSlawter


willj12 wrote:

If your TV is 720p/1080i it will have 1366x768 pixels, so using 1080i (1920x1080) requires more scaling and will look worse than 720p (1280x720). Also, 1080i is interlaced; depending on how good the video processor in your TV is at deinterlacing, that can make it look worse as well.

I thought 1920x1080 was 1080p and 1680x1050 was 1080i? If I'm wrong, what is 1680x1050?

willj12


TrueSlawter wrote:

I thought 1920x1080 was 1080p and 1680x1050 was 1080i? If I'm wrong, what is 1680x1050?

Yep, 1080p is 1920x1080 and 1080i is 1920x540.


kingchaz


willj12 wrote:

Yep, 1080p is 1920x1080 and 1080i is 1920x540.


Wrong, they're both 1920x1080.

1080i is technically a higher resolution in terms of the amount that is displayed per frame (NOT to be mixed up with 'per field refresh'). However, because it refreshes odd and then even fields per frame update (going through 2 update passes to refresh one frame), the actual real-time representation of the data is inferior. The total frame is still higher resolution, but since the frame consists of 2 sets of alternate field refreshes, the detail is degraded by the alternating fields.

You seem to be of the impression that 1920x1080i basically equates to 1920x540p in terms of detail - this is false. You miss the fact that the data stored in each alternate field refresh is different, and the 2 field refreshes ultimately make 1 single frame. It is the manner of refresh that is different. What you are saying would only make sense if the frame were made of just one of the field passes (which would hold half the detail/lines of the entire frame), which we know is not the case.


SpikyCat


kingchaz wrote:

Wrong, they're both 1920x1080.

1080i is technically a higher resolution in terms of the amount that is displayed per frame (NOT to be mixed up with 'per field refresh'). However, because it refreshes odd and then even fields per frame update (going through 2 update passes to refresh one frame), the actual real-time representation of the data is inferior. The total frame is still higher resolution, but since the frame consists of 2 sets of alternate field refreshes, the detail is degraded by the alternating fields.

You seem to be of the impression that 1920x1080i basically equates to 1920x540p in terms of detail - this is false. You miss the fact that the data stored in each alternate field refresh is different, and the 2 field refreshes ultimately make 1 single frame. It is the manner of refresh that is different. What you are saying would only make sense if the frame were made of just one of the field passes (which would hold half the detail/lines of the entire frame), which we know is not the case.


 

That's not right either - you're getting your frame rates, refresh rates and progressive/interlaced mixed up. 1080i is an interlaced signal, which means that only every other line of the image (the odds and evens you refer to) is drawn per frame, and each of those frames contains only 1920x540 pixels. Hence it is the current standard for broadcast HD content, as it only requires half the data bandwidth.

 

What you describe as "2 update passes to refresh one frame" is the processing that the TV receiving the signal carries out before displaying the image - a process also known as 'line doubling', where basically the TV processes each pair of consecutive odd/even 1920x540 frames and combines them into a single frame that it can display progressively. This is why a native progressive feed is generally perceived as 'smoother', as the line doubling process has the effect of halving the frame rate of a 1080i signal. So if you had say 1080i @ 60fps, what would be displayed after line doubling would be 1080p @ 30fps. Additionally, due to each odd/even frame being slightly different (or quite a lot different in very fast moving scenes), line doubling produces artefacts where you can see the visible side effect of two halves of two different images being hashed together - the degree to which you see that depends on how good a job your TV's processor does of masking it.

 

But the bottom line is, half the amount of data does equal half the amount of picture. That is counteracted by the process above to create half as many full-frame images, but it cannot produce the same results as a proper 1080p feed (i.e. Blu-ray movies) where the entire image is drawn for every frame. So to summarise: a 1080p @ 60fps feed really is 60 full images drawn every second at 1920x1080, but 1080i @ 60fps is 60 half images drawn every second at 1920x540, which your TV processes to produce 30 full images per second at 1920x1080 - and the result of the latter is a lower quality image.
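The field-combining step described above can be sketched as a toy 'weave' deinterlacer - the function and line names here are illustrative, not from any real video API:

```python
# Toy weave deinterlace: interleave the two fields of one interlaced frame
# (top field = lines 0, 2, 4, ...; bottom field = lines 1, 3, 5, ...)
# back into a single progressive frame.
def weave(top_field, bottom_field):
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

# A 4-line progressive frame split into two 2-line fields and rebuilt:
original = ["line0", "line1", "line2", "line3"]
top, bottom = original[0::2], original[1::2]
assert weave(top, bottom) == original
```

Two fields in, one frame out - which is why 60 fields/s of 1080i becomes 30 frames/s after weaving, and why any motion between the two fields shows up as the combing/'hashing' artefacts mentioned above.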


kingchaz


SpikyCat wrote:

That's not right either - you're getting your frame rates, refresh rates and progressive/interlaced mixed up. 1080i is an interlaced signal, which means that only every other line of the image (the odds and evens you refer to) is drawn per frame, and each of those frames contains only 1920x540 pixels. Hence it is the current standard for broadcast HD content, as it only requires half the data bandwidth. [...]

What are you going on about? I suggest you read my post again and this time try to understand it.

You've just repeated what I've said, more or less.

Barso


I appreciate all the help, but for me all I know is that my Blu-ray movies (e.g. Batman Begins) look far better and sharper in 1080i than in 720p, while my games (e.g. GT5 Prologue) look rubbish in 1080i.

I would just like an auto-switching option, please.


SpikyCat


kingchaz wrote:

SpikyCat wrote:

kingchaz wrote:

willj12 wrote:

TrueSlawter wrote:

willj12 wrote:

Barso wrote:

I would really like an option on the XMB that automatically selects 1080i when I play a blu-ray movie and am using the XMB and when I put a game it switches to 720p.

I just don't like having to select it myself from the display options and having to switch back when I start to play a game.


if yor tv is 720p/1080i it will have 1366x768 pixels, so using 1080i (1920x1080) requires more scaling and will look worse than 720p (1280x720). Also 1080i is interlaced, depending on how good the video processor in your tv is at deinterlacing it will also make it look woese.




i thaught 1920x1080 is 1080p? & 1680 x 1050 is 1080i? if im wrong what is 1680 x 1050?

yep 1080p is 192x1080 and 1080i is 1920x540


Wrong,there both 1920x1080 ............

1080i is technically a higher resolution in terms of the amount that is displayed per frame (NOT to be mixed up with 'per field refresh'). However, thanks to the fact that it refreshes odd and then even fields per frame update (going through 2 update passes to refresh one frame), the actual real realtime representation of the data is inferior. The total frame is still higher resolution, but since the frame is consisted of 2 sets of alternate field refreshes, the detail is ruined because of the alternating fields.

You seem to be of the impression that 1920x1080i basically equates to the same as 1920x540p in terms of detail - this is false. You seem to miss the fact that the data stored in each alternate field refresh is different, and the 2 field refreshes ultimately make 1 single frame. It is the manner of refresh that is different.
What you are saying would only make sense if the frame was only made of one of the field passes (which would hold half the detail/lines of the entire frame), which we know is not the case.


 

That's not right either - you're getting your frame rates, refresh rates and progressive/interlaced mixed up . 1080i is an interlaced signal, which means that only evey other line of an image (the odds and evens you refer to) are drawn per frame, and each one of those frames only contain 1920x540 lines. Hence it is the current standard for broadcast HD content as it only requires half the data bandwidth.

 

What you describe as "2 update passes to refresh one frame" is the processing that the TV receiving the signal carries out before displaying the image - a process also known as 'line doubling', where basically the TV processes each pair of consecutive odd/even 1920x540 frames and combines them into a single frame that it can display progressively. This is why a native progressive feed is generally perceived as 'smoother', as the line doubling process has the effect of halving the frame rate of a 1080i signal. So if you had say 1080i @ 60fps, what would be displayed after line doubling would be 1080p @ 30fps. Additionally, due to each odd/even frame being slightly different (or quite a lot different in very fast moving scenes), line doubling produces artefacts where you can see the visible side effect of two halves of two different images being hashed together - the degree to which you see that depends on how good a job your TV's processor does of masking it.

 

But the bottom line is, half the amount of data does equal half the amount of picture. That is counteracted by the process above to create half as many full frame images, but cannot produce the same results as a proper 1080p feed (i.e. Blu-Ray movies) where the entire image is drawn for every frame. So to summarise, 1080p @ 60fps feed really is 60 full images drawn every second at 1920x1080 but 1080i @ 60fps is 60 half images drawn every second at 1920x540, which your TV processes to produce 30 full images drawn every second at 1920x1080, but the result of the latter is a lower quality image.

Message Edited by SpikyCat on 23-08-2008 05:53 AM

What you goin on about ,i suggest you read my post again and this time try to understand it.

Youve just repeated what iv said more or less.
Message Edited by kingchaz on 23-08-2008 12:28 PM

No really, I did understand what you said, but what I'm saying is that, contrary to what you said, it is not actually wrong to say a 1080i feed is 1920x540, because it is made up of frames of that resolution. Each pair of 1920x540 frames only becomes a single 1920x1080 frame as a result of the TV's picture processing. And in fact, on a 720p/768p native panel, it will be downscaled anyway, so 1080i becomes irrelevant. There are some old HDTVs that have a native resolution of 1920x1080 but can only handle interlaced rather than progressive signals, so inputting a 1080i signal to one of those TVs would result in an interlaced image made up of successive frames of 1920x540.

 

Might be technicalities, but people get easily confused about these things and end up convincing themselves that they're seeing extra detail with 1080i. :smileywink:


ElectricJuice


I'm not going to go into much detail as I don't have time, but to put it bluntly, 1080i is really kind of pointless.

The 'p' will always be better than the 'i' when watching movies or playing games. Once the TV has messed about with all the scaling it has to do, you'll just end up with a worse picture anyway.


kingchaz


SpikyCat wrote:

No really, I did understand what you said, but what I'm saying is that, contrary to what you said, it is not actually wrong to say a 1080i feed is 1920x540, because it is made up of frames of that resolution. Each pair of 1920x540 frames only becomes a single 1920x1080 frame as a result of the TV's picture processing. [...]


 

Agree on this we will. :smileyvery-happy:

M_G


I don't think 1080i displays moving things on screen as well as 720p does.

I think a lot of people get confused between 1080i and 1080p.


bulletmunch

Why is it that most games are 720p and not 1080i?

zyxses

Me too. Preset the PS3 to play games/browser at 720p and movies at 1080p... how hard can it be?