Popular Myths When Choosing Components

I often hear the same rumors or half-truths when helping someone build their first PC. They read X is better than Y in some article or blog, and now it's stuck in their brain. I do have certain brands I tend to like above others, but I've been building these things long enough now to see trends change. Today's "facts" become tomorrow's untruths on a fairly regular cycle. The following are some claims - heard directly or hashed out in forums with a number of people - that are patently wrong. Try to avoid them when putting together the components for your build.

Intel processors are better than AMD processors (for gaming)

I've been building PCs long enough to remember when AMD blindsided Intel with the release of the AMD Athlon (K7) in 1999. (At least, it seemed like Intel was caught with its pants down.) AMD's architecture allowed it to do more work at the same clock speed, making the K7 the fastest CPU available. The Thunderbird core that followed was faster still, and clock speed was no longer the only measure of a CPU's performance. The Athlon 64 that came after that kept AMD in the lead. I built gaming systems on AMD CPUs until Intel released its Core 2 systems in 2006. I kept a tiny light lit hoping it would happen again, and finally, in 2017, it seems to have. At least, AMD is now competitive even near the top end.

This is a rumor that is not completely false. It is true (currently - and for the last five years or so) that at the top end of the cost and performance scale, Intel processors have a measurable advantage over AMD processors. That is, if you're planning on spending at or over $250 for the CPU alone, then one of the Intel Core i5 or i7 processors is the one to pick. However, if your budget limit for the CPU is around $150, the choice between an Intel and an AMD CPU becomes less clear-cut. The best approach is to figure out the maximum you're willing to spend and then find the best CPU - Intel or AMD - that fits within that budget.

Tom's Hardware is one of my favorite sites for doing research when working up the specifications for a new system. Recently, they added a Best Gaming CPU for the Money monthly column. That's a great place to start. Go to the CPUs section of the site and look for the latest column. The CPU dictates what motherboards you can get, so pick that first.

Nvidia graphics cards are better than ATI graphics cards (or vice versa)

This is a rumor that I wish I felt was more of a rumor. In sheer number-crunching, Nvidia and AMD have done a pretty good job of slotting their video cards into a price/performance lineup. It should be as simple as figuring out what your budget is and then buying whichever manufacturer's card you can get within that range. The truth - for me at least - isn't that simple. For me, if the "correct" choice is an AMD card, I balk. If it's at all possible to bump up to the next highest Nvidia card from that initial choice, I'll take it every time. Why? The answer is video drivers and build quality.

I have had and still have at least one AMD card in my systems. It is not, however, a gaming system. It's a Linux box with an AMD card of the kind typically used for home theater PCs. My last "gaming" AMD card was actually an ATI Radeon 9700 (R300, from 2002). It had to be replaced because of bad capacitors, which produced the pink checkerboard of death whenever I tried to play a game. I got it direct from ATI, so I had to ship it to Canada for replacement. I remember this because Canada wanted me to pay an import tax on the declared value.

More recently, a friend tried to use a card made by MSI based on an AMD R9 290X GPU. It was the correct card for his budget, and the R9 290X had decent reviews. His first card booted fine but would lock up whenever he tried a game. Sad, but any manufacturer can have an occasional issue, so he got a replacement. The second one had the same issue. This time, we tried it in my gaming rig, thinking it might be some issue specific to his system. It got a little farther, but running 3DMark locked the system up within a minute or so. Eventually, that one died so badly that I couldn't even get back into Windows long enough to uninstall it. The AMD Catalyst drivers were in such a bad state that I blue-screened when I tried to put my actual video card (by Nvidia) back into the system. I had to nuke and pave my OS to get the system working again. Perhaps this was an MSI issue specific to its AMD cards. That said, we replaced that card with an MSI card based on the Nvidia GTX 970. It cost more, but now there are no problems with it in his system. That leaves me rather soured on AMD cards.

The bottom line is that, technically, you shouldn't pick Nvidia or AMD as always being the best. Look and see what makes sense for the budget available. Tom's Hardware also has a GPU section like the CPU section and publishes a similar monthly article entitled Best Graphics Cards For The Money.

Installing a larger power supply means my system will use more power

This is a question I've answered online on more than one occasion. I have taken the liberty of copying myself. I'll sue myself for infringement later.

Computer power supply units (PSUs) are on-demand current draw devices. That is, they only supply as much power on the various voltage lines (3.3V, 5V, 12V, etc.) as the components in your PC require. As such, if you were to replace your current power supply with a larger rated one (without changing any other components in your system), the difference in the current draw should be negligible. More than that, if you replace an old, poorly-designed 300W PSU with a new, more efficient 550W model, it's even possible the current draw will decrease rather than increase. Efficiency is the ratio of the power delivered to the computer components to the power drawn from the wall socket. A loss of efficiency manifests itself as heat generation. A PSU that is 85% efficient wastes less electricity in the form of heat than a 70% efficient PSU.

Power supplies tend to operate most efficiently when they are being driven at 50-75% of their rated maximum load. Let's say you've been adding hard drives over time (even external ones, if they are powered by the USB port) and have upgraded your video card as well. The 300W power supply was fine when you first got your system, but now, let's say you are drawing 260W as a worst case. (It won't always draw that much, but when playing a video game that is driving the graphics card and the CPU hard, it may stay at that draw for extended periods of time.) Your 300W PSU is being forced to operate at 87% of its rated max. A 550W PSU, on the other hand, would only be operating at 47% of its rated max. The 300W power supply - because of the loss in efficiency converting 120V AC to 3.3V, 5V, 12V, etc. DC when loaded above 75% - may require more current from the wall to deliver 260W than the 550W supply would require to do the same. (It's more complicated than this in that it matters how much current is needed by each of the voltage "rails," such as 12V compared to 3.3V, rather than just the total power. I've also ignored thermal design power altogether.)
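
To put rough numbers on that comparison, here's a minimal sketch of the math in Python. The 70% and 85% efficiency figures are assumptions for illustration only - a real PSU's efficiency varies with load, temperature, and how the draw is split across the rails.

```python
# Back-of-the-envelope comparison of two PSUs delivering the same 260W load.
# The efficiency values are assumed for illustration, not measured.

def wall_draw(delivered_watts, efficiency):
    """Watts pulled from the wall socket to deliver a given DC load."""
    return delivered_watts / efficiency

load = 260  # worst-case DC load from the example above, in watts

old_psu = {"rating": 300, "efficiency": 0.70}  # older unit pushed past its sweet spot
new_psu = {"rating": 550, "efficiency": 0.85}  # newer unit running near 50% load

for name, psu in (("300W PSU", old_psu), ("550W PSU", new_psu)):
    pct_load = load / psu["rating"]
    from_wall = wall_draw(load, psu["efficiency"])
    print(f"{name}: {pct_load:.0%} load, about {from_wall:.0f}W from the wall")
```

With those assumed numbers, the smaller supply pulls roughly 371W from the wall while the larger one pulls about 306W to deliver the exact same 260W to the components.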

That said, it's a good idea to check the output of an existing PSU and upgrade it when adding components with a higher current draw - which is most often a new video card. Some of the very high-end graphics cards now require 250W or more when they are running full blast. Put a pair of those in an SLI motherboard and you see why 1000W supplies are needed. (You are potentially at 50% load of the PSU with the graphics cards alone.) So, if you do upgrade your graphics card, chances are you will draw more current from the wall. However, that extra draw comes from the new card, not from the new PSU. If you want to verify or measure this, purchase a cheap Kill-A-Watt power meter and check the amperage used before and after replacing the power supply. The difference should be barely noticeable. Try the same after upgrading a graphics card to one that uses more power, and there will be a difference.
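
If you'd rather sanity-check an upgrade on paper before buying anything, here's a rough sizing sketch along the same lines. The component wattages and the 75% rule of thumb are ballpark assumptions for illustration, not measured figures for any specific parts.

```python
# Rough PSU headroom check when planning an upgrade.
# All component wattages are assumed ballpark values for illustration.

def psu_headroom(component_watts, psu_rating, sweet_spot=0.75):
    """Return total estimated draw, load fraction, and a simple verdict."""
    total = sum(component_watts.values())
    load = total / psu_rating
    verdict = "fine" if load <= sweet_spot else "consider a bigger PSU"
    return total, load, verdict

planned_build = {
    "CPU": 95,
    "graphics card": 250,            # a high-end card running full blast
    "drives, fans, motherboard": 60,
}

for rating in (300, 550):
    total, load, verdict = psu_headroom(planned_build, rating)
    print(f"{rating}W PSU: ~{total}W estimated draw ({load:.0%} load) -> {verdict}")
```

Run with these made-up numbers, the 300W unit lands well over its rating while the 550W unit sits at about 74% - right at the edge of the sweet spot, which is exactly the kind of result that tells you whether the upgrade needs a PSU swap as well.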