I've been through the settings and the option for 1600x900 doesn't exist, even though it's the native resolution of my new monitor. I tried updating the driver for my graphics card (ATI Radeon HD 3400), but that only screwed things up further: now I can't even use the next best resolution (1440x900), which worked before I updated the driver. The monitor is a Dell ST2010B, currently connected through VGA. I've tried HDMI in the past with a different monitor and it didn't work at all.

asked Jun 06 '10 at 20:38 by webbupdates

edited Jun 06 '10 at 20:38

Have you tried specifying that your monitor is a Dell ST2010B in Windows?

Download and install the drivers for your monitor from Dell's website.

If the driver install didn't set your monitor driver properly, set it manually by following these steps (originally shown as a screenshot):

  1. Go to your display properties by right-clicking the desktop and selecting Change Resolution.
  2. Click Advanced settings.
  3. Go to the Monitor tab and click Properties.
  4. Go to the Driver tab and click Update Driver....
  5. Select Browse my computer for driver software, then Let me pick from a list of device drivers on my computer.
  6. If the driver does not show initially, uncheck Show compatible hardware.
  7. Find the driver, select it, and press Next and continue.

You may need to reboot before the new resolution becomes available.
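
To double-check what the graphics driver is actually exposing (independent of the Windows dialogs), you could enumerate its mode list with the Win32 `EnumDisplaySettings` call. This is only a minimal sketch, assuming a Windows C toolchain linked against `user32`:

```c
/*
 * Minimal sketch: list every display mode the current graphics driver
 * reports, so you can see whether 1600x900 is offered at all.
 * Assumes Windows and a C toolchain; link against user32.lib.
 */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* NULL = the primary display device; i walks the driver's mode list. */
    for (DWORD i = 0; EnumDisplaySettings(NULL, i, &dm); ++i) {
        printf("%lux%lu @ %lu Hz, %lu bpp\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmDisplayFrequency, dm.dmBitsPerPel);
    }
    return 0;
}
```

With MSVC you could build it with something like `cl modes.c user32.lib`. If 1600x900 never shows up in the output, the driver itself isn't exposing that mode over the current connection.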

answered Jun 06 '10 at 21:13 by strager

Update: I followed your advice and updated the driver for my monitor, but it didn't work. However, somehow I was able to get 1440x900 to work again.

(Jun 07 '10 at 23:03) webbupdates

@webbupdates, do you mean the driver didn't install properly, or that the resolution didn't appear after installing the driver?

You can also try lowering the refresh rate (to 60 Hz, for example) and then changing the resolution. Sometimes Windows does weird things and isn't smart enough to adjust the refresh rate when changing the resolution.

Also try going into the NVIDIA Control Panel / ATI Catalyst Control Center. Look for an option to read EDID information from your monitor, and for a way to set the resolution manually.
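
If Catalyst won't cooperate, you can also ask the driver directly whether it accepts the mode. Below is a minimal sketch using the Win32 `ChangeDisplaySettings` call, again assuming a Windows C toolchain linked against `user32`; `CDS_TEST` validates the mode without applying it, and 60 Hz is just an assumed conservative refresh rate:

```c
/*
 * Minimal sketch: ask the driver whether it will accept 1600x900 at 60 Hz,
 * and if so, apply it only for the current session.
 * Assumes Windows and a C toolchain; link against user32.lib.
 */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);
    dm.dmPelsWidth = 1600;
    dm.dmPelsHeight = 900;
    dm.dmDisplayFrequency = 60;   /* assumed conservative refresh rate */
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    /* CDS_TEST validates the mode without changing anything on screen. */
    if (ChangeDisplaySettings(&dm, CDS_TEST) != DISP_CHANGE_SUCCESSFUL) {
        printf("Driver rejected 1600x900 @ 60 Hz\n");
        return 1;
    }

    /* CDS_FULLSCREEN applies the mode temporarily (not saved to the registry). */
    if (ChangeDisplaySettings(&dm, CDS_FULLSCREEN) != DISP_CHANGE_SUCCESSFUL) {
        printf("Mode passed the test but the switch failed\n");
        return 1;
    }
    printf("Switched to 1600x900 @ 60 Hz\n");
    return 0;
}
```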

(Jun 08 '10 at 07:58) strager

The resolution didn't appear after installing the driver. I tried what you suggested and enabled the option in Catalyst that stops it from automatically detecting the monitor. My monitor started blinking on and off and I had to shut down the computer with the power button. Now every time I open Catalyst, the monitor blinks on and off and I have to hit the power button.

Edit: OK, at one point the blinking stopped after I tried again, and I was able to set it back to automatically detecting my monitor. I also checked again to make sure the option for 1600x900 wasn't there, and it wasn't.

(Jun 09 '10 at 01:51) webbupdates

First quality answer I've seen yet. You are awesome.

(Jun 09 '10 at 02:11) blackbird307

I've fixed the problem! I just switched from VGA to HDMI. It automatically went to 1600x900 when I booted up. An added bonus: the screen looks much clearer! Now I just need to go buy another HDMI cable, as I've already listed the only one I have on eBay along with my PS3. lol

(Jun 09 '10 at 03:26) webbupdates

@webbupdates, That's good to hear! When I bought this monitor, it came with a bad VGA cable, and the picture was completely blurry. I tried the BNC input, too, but it didn't seem to want to reach higher resolutions (1600x1200 was as high as it would go). I swapped out the VGA cable, and I got a great picture and all the resolutions I could ever need. Glad you got your monitor working. =]

@blackbird307, That's sad to hear, and it's one of the reasons I'm beginning to dislike this community and the direction it's headed. =[

(Jun 09 '10 at 07:31) strager

@webbupdates, that was exactly my thought: you were using VGA, which isn't nearly as good as HDMI. Note: always buy a more expensive HDMI cable! Trust me, it's worth it.

answered Jun 09 '10 at 07:46 by Mihkel

Hello guys, Alok here.

I'm facing the same kind of problem. I'm using a Dell IN2030M monitor over VGA (Standard VGA Graphics Adapter), with no DVI-D cable. The system is Windows 7 Ultimate 32-bit, an Intel Pentium 4 at 3.06 GHz, 1.5 GB of RAM, and a Mercury motherboard; it's an assembled system.

I recently bought the Dell monitor and the option for 1600x900 doesn't exist, even though it's the monitor's native resolution. Could anyone help?

answered Mar 20 '13 at 18:06 by LiveStrongSpark


Asked: Jun 06 '10 at 20:38

Seen: 20,127 times

Last updated: Mar 20 '13 at 18:06