Over the years I have fixed or repaired many computer systems, working with everything from RAM to motherboards. On laptops, though, I have basically limited myself to adding RAM and replacing a hard disk or a CD/DVD drive. So this morning, when I read this article over at PC World about replacing a laptop's video display adapter, I had to take a look.
The article itself is a slide show covering the procedure the user completed on a Dell laptop. The basics of removing the screen, keyboard, and internals were covered. But when it came to the last step of replacing the video adapter, the show-and-tell ceased. Instead the writer just stated:
Step 5: Replace Graphics Card, and Close
On our laptop, the preceding steps gave us access to the graphics/video card assembly, which we removed by loosening two screws. Then, after installing our new card, we reversed the procedure to put everything back together as before, and fired up the laptop.
New card installed, the system gave us basic video with standard VGA drivers, which kicked in automatically. After we downloaded the proper video driver from the vendor’s Web site, we enjoyed full resolution and color support.
Interesting. But I have a few questions. Has anyone tried this? Also, are the instructions too spotty to allow a user to complete the procedure? Or am I being too picky?
Another question: don't most laptops have the graphics built onto the motherboard rather than plugged in as a separate card? It would seem to me that those cannot be upgraded.
Let me know what you think.