System configuration:
Motherboard: Gigabyte GA-H55M-UD2H
Processor: Intel Core i3-540
O/S: Windows 7 x64
TV: Samsung Series 7 63" Plasma 3D
Connection: HDMI
The Intel video chipset recognizes the TV as 1920 x 1080, 32-bit, 60 Hz, and the TV reports an incoming HDMI signal of 1920 x 1080 @ 60 Hz. But for some reason the desktop is larger than the TV's display area, both horizontally and vertically. To fix this I can either adjust the H and V scaling options to 70%, or change the TV's display mode from 16:9 (1920 x 1080) to "Fit to Screen"; either one shrinks the desktop to fit the TV's visible area.
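For what it's worth, the symptom described (edges of the desktop cut off on all four sides, fixed by "Fit to Screen") is what TV overscan looks like: the panel deliberately crops a few percent off each edge of the incoming picture. A rough back-of-the-envelope calculation (the 5% per-edge figure below is an assumption for illustration, not a measured value for this TV) shows how much of a 1920 x 1080 desktop would remain visible:

```python
# Sketch: estimate how much of a 1920x1080 desktop survives TV overscan.
# The 5% per-edge crop is a hypothetical figure, not a measured value.
WIDTH, HEIGHT = 1920, 1080
OVERSCAN_PER_EDGE = 0.05  # assumed fraction cropped on each edge

visible_w = WIDTH * (1 - 2 * OVERSCAN_PER_EDGE)
visible_h = HEIGHT * (1 - 2 * OVERSCAN_PER_EDGE)
print(f"Visible area: {visible_w:.0f} x {visible_h:.0f} pixels")
```

Under that assumption only 1728 x 972 of the 1920 x 1080 desktop would be visible, which is in the same ballpark as the ~70% scaling workaround mentioned above.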
But I don't understand. If HDMI is digital, then it should be sending pixel information for 1920 x 1080, and this should match the native resolution of the display, so everything should be displayed perfectly. But it isn't. So it would appear that one of the following is true:
a) the native resolution of the TV is not 1920 x 1080 but something smaller, hence the oversized desktop;
b) the desktop under Windows is not actually 1920 x 1080 but larger; or
c) the signal is not digital but analogue, and the voltage levels are larger than what the TV expects.
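Hypothesis (b) is easy to rule in or out directly. A minimal sketch, assuming Python is available on the Windows 7 box, that asks Windows for the actual primary-desktop resolution via the Win32 `GetSystemMetrics` call:

```python
import ctypes
import sys

# Win32 system-metric indices for the primary display's pixel dimensions.
SM_CXSCREEN = 0
SM_CYSCREEN = 1

def desktop_resolution():
    """Return (width, height) of the primary desktop in pixels (Windows only)."""
    user32 = ctypes.windll.user32
    return (user32.GetSystemMetrics(SM_CXSCREEN),
            user32.GetSystemMetrics(SM_CYSCREEN))

if sys.platform == "win32":
    w, h = desktop_resolution()
    print(f"Desktop resolution: {w} x {h}")
```

If this reports 1920 x 1080, hypothesis (b) is eliminated. Note that `ctypes.windll` only exists on Windows, which is why the call is guarded by the platform check.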
And I'm not sure what the scaling adjustment on the video chipset actually does: does it reduce the number of pixels it outputs, or does it reduce an analogue voltage (which wouldn't make sense, since the HDMI interface is supposed to be digital)?
Anyone got any ideas what's going wrong here?