As it turns out, Windows Vista really wasn’t all that slow; and no, your PC probably won’t fry if you open it up without wearing a wrist strap. Thanks in large part to the Internet, the tech world is teeming with lies, half-truths, and misinformation. We’ve dug up some of the Web’s most notorious nuggets of conventional wisdom to see which hold up to scrutiny and which are merely urban legends.
Of course, there’s often a grain of truth in even the most fanciful myth. That’s why we provide a handy-dandy set of numbered warning signs to indicate how accurate each of these myths is, with 1 being True and 4 being Outrageous–a complete fabrication. After all, they say numbers never lie.
The claim: Vista is slower than Windows 7
When Windows Vista came out, it soon acquired a reputation for being slow and a resource hog. Once Windows 7 arrived, people were quick to tout it as the speedy, slim operating system that Vista should have been.
We conducted performance tests on a handful of laptops and desktops using both 32-bit and 64-bit versions of Vista and Windows 7, shortly after the latter OS was released. While results varied across configurations, a few trends stood out. Windows 7 raised WorldBench 6 scores by anywhere from 1.25 percent to almost 10 percent (but most often in the vicinity of 2 to 3 percent); it also delivered much faster disk operations (in Windows 7 our Nero disc-burning software tests ran twice as fast on an IdeaPad laptop, and 2.5 times as fast on a Gateway laptop) and slightly longer battery life (the IdeaPad lasted only an extra minute; the Gateway got an extra 15 minutes).
While Windows 7 did seem to speed things up somewhat, a few tests actually showed some slowdown. Applications launched more slowly across the board (the most dramatic change was a 2.7-second Photoshop CS4 launch in Vista stretching to 9.6 seconds in Windows 7), and the Gateway laptop saw a slight increase in startup time (39.6 seconds in Vista; 43.6 seconds in Windows 7).
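To make the arithmetic behind those comparisons concrete, here's a minimal sketch. The scores and timings below are hypothetical, chosen only to illustrate how a "2 to 3 percent" improvement or a "twice as fast" result is calculated; WorldBench 6 reports a single composite score, and higher is better.

```python
def percent_change(before, after):
    """Percentage improvement from an old benchmark score to a new one."""
    return (after - before) / before * 100

def speedup(old_seconds, new_seconds):
    """How many times faster a timed task finished."""
    return old_seconds / new_seconds

# Hypothetical WorldBench 6 scores showing a ~3 percent bump,
# in line with the typical gains we measured:
print(round(percent_change(100, 103), 2))  # 3.0

# Hypothetical disc-burning times: finishing in half the time
# is what "twice as fast" means here.
print(speedup(120, 60))  # 2.0
```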
As it turns out, the “snappy” feeling Windows 7 engenders has to do with Registry tweaks and minor changes to the window manager that make the OS feel more responsive, even though it isn’t that different.
The verdict: Windows 7 is faster, but not by as much as you may think.
(Warning: 2, mostly true)
The claim: All smartphones suffer signal loss from a ‘Grip of Death’
When early iPhone 4 adopters discovered that touching a certain spot on the exposed antenna could cause the phone to lose signal strength, reduce data speeds, and even drop calls, Apple insisted that all smartphones suffered from a similar defect.
We tested that claim with five different smartphones. We looked at RF signal strength, data speed rates, and call quality in areas with weak and strong signals.
While every phone we tested was affected by a “grip of death,” none went so far as to drop calls, as the iPhone 4 did. Bottom line: If you don’t have an iPhone 4, you don’t need to worry too much about this antenna issue.
(Warning: 2, mostly true)
The claim: The desktop PC is dying
Sure, laptops are cheaper and more powerful than ever, and can meet all your basic computing needs. But saying that the desktop is on its deathbed is like saying that, since all most people need is a Geo Metro, the pickup truck is obsolete. Power users who need desktop-caliber performance in a laptop must pay a significant premium, and if they want a Blu-ray drive, a better GPU, or a 3D display, they must buy a new model. Also, people who like to tinker with their PCs have fewer options with laptops than they do with desktops.
Meanwhile, the desktop PC market is evolving to meet users’ demands. People who want a larger display but don’t like the looks of a tower can buy an all-in-one system. Others want a computer that fits nicely next to their 50-inch HDTV–a home theater PC. And students, who typically benefit most from a laptop, can buy both a solid all-in-one PC for gaming and movies (ahem–“multimedia projects”) and a cheap, lightweight netbook for taking notes in class for the same price as a single moderately powerful laptop (which would be more expensive to replace if it were broken, lost, or stolen).
(Warning: 3, Dubious)
The claim: High-priced HDMI cables make your HDTV look better
When you plunk down $1,200 (or more) for a new HDTV and $300 for a Blu-ray player, it can be easy for a salesperson to guilt you into tacking a $150 HDMI cable onto your purchase–after all, your brand-new gear needs a good cable to get the image quality you’re paying for, right? If you’re lucky, you’ll have the alternative of buying the “cheap” store-brand cable, at a cost of only $30 and a disapproving look from the cashier. Well, feel free to take that $150 and spend it on popcorn for the movies you’ll be watching–your HDTV won’t care which HDMI cable you use.
High-quality cables have been a staple of the audio/video business for decades, and for good reason: As an analog audio or video signal travels from one device to another, it's susceptible to interference and disruption. The image data leaving your DVD player isn't 100 percent identical to the image that shows up on your TV, because parts of the signal can get lost along the way.
However, digital audio/video standards like DisplayPort, DVI, and HDMI don’t have this problem because the data being transmitted over the cable isn’t as sensitive as an analog signal; it consists entirely of ones and zeros, and a tremendous drop in signal voltage has to occur before a one starts to look like a zero at the receiving end. When this does happen, you’ll usually see some kind of white static “sparklies” on your TV, as the set attempts to fill in the blanks itself, but this typically happens only over very long HDMI runs (8 meters and up). For shorter cables, the cable quality shouldn’t matter.
That explanation rarely succeeds in silencing the home-theater enthusiasts (and home-theater salespeople) who swear that they see a difference between the good stuff and the cheap stuff, so we decided to check them out ourselves to see whether cost made a difference. We tested two pricey HDMI cables–the Monster HD1000 ($150) and the AudioQuest Forest ($60)–against a couple of bargain-basement cables from Blue Jeans Cable (the 5001A-G, $5) and Monoprice (the 28AWG, $3.04).
After testing different kinds of high-def video clips (including clips of football broadcasts and selections from The Dark Knight on Blu-ray), we ended up with all four cables in a dead heat: Blue Jeans Cable, Monoprice, and Monster all saw an average rating of 3.5 out of 5, with AudioQuest trailing ever so slightly at 3.4–close enough to practically be a rounding error. So save your money and stick to the cheaper cables unless you need the cables to cover a long distance.
(Warning: 4, outrageous)
The claim: LCDs are better than plasma screens for HDTV sets
Don’t believe the hype: Your local HDTV salespeople may be trying to upsell you on a spiffy new LCD, but there are plenty of reasons to pick a plasma instead. Plasmas still handle darker scenes better, have a wider range of viewing angles, and are generally cheaper than LCDs (especially at larger sizes).
Panasonic and Samsung continue to manufacture plenty of plasma sets (including a line of home 3D TVs and a gigantic, superexpensive 152-inch 3D display).
LCDs are catching up in a few respects, however. LCD sets with LED backlighting and higher refresh rates can compensate for some of the traditional problems of LCDs, and they suck up significantly less power than plasma sets do, so the higher price may be offset over time in your electricity bill.
Despite the remaining advantages of plasma, it's worth noting that some manufacturers are dropping out of the plasma display market (Pioneer, most notably, and Vizio), and California plans to ban power-hungry TVs. The writing is undeniably on the wall: Plasma isn't dead yet, but it may be finished in a few years.
(Warning: 3, dubious)
The claim: More bars on your cell phone means better service
The signal bars on your cell phone display indicate the strength of your cellular signal to the nearest tower. But if you’re connected to a tower that lots of other people are connected to, you could have a strong signal and still have poor service, since everyone’s calls are competing for scarce network resources. Once your information arrives at the cellular tower from your phone, it has to travel through your service provider’s backhaul network (which connects the tower to the Internet). And if your provider’s network isn’t up to snuff, you could have a flawless connection to an empty cell tower, and yet still encounter poor speeds and dropped calls.
When we tested 3G service in 2009, we found that signal bars were poor indicators of service quality in 12 of the 13 cities in which we tested. In San Francisco, for one, signal bars correlated with service quality in only 13 percent of test results. Additionally, if you use an iPhone, you might just be seeing inaccurate readings. Apple recently announced (in connection with the iPhone 4 antenna issue) that the formula it had been using in all iPhones to display signal strength was “totally wrong” and often reported the signal as two bars higher than it should have. Oops.
(Warning: 3, dubious)
The claim: Over time, inkjet printers are much more expensive than laser printers
To figure out how much a printer's consumables will cost you over time, divide the price of the ink or toner cartridge by its estimated page yield; the result is your cost per page.
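That formula is simple enough to sketch in a few lines. The cartridge price and page yield below are made-up numbers, purely to illustrate the calculation:

```python
def cost_per_page(cartridge_price, page_yield):
    """Cartridge price divided by its estimated page yield."""
    return cartridge_price / page_yield

# A hypothetical $30 cartridge rated for 400 pages:
print(f"{cost_per_page(30.00, 400) * 100:.1f} cents per page")  # 7.5 cents per page
```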
Traditionally, laser printers have had a higher initial purchase price, which was balanced by their lower cost per page versus inkjet printers.
However, as inkjet printer manufacturers began to release more efficient models (ones with separate ink tanks for each color, or higher-yield cartridge options), the cost-per-page gap has closed dramatically. Businesses needing cheap, fast printers, for example, could do well with either the Epson B-510DN inkjet (1.3 cents per black text page, 14.7 pages per minute, $600 retail price), or one of the more economical laser printer models, such as the Oki C610dtn (1.1 cents per black text page, 19.1 pages per minute, $700 retail price). Home users and students have fewer options–paying less for the printer means paying more for the ink. To its credit, the Canon Pixma iP4700 (2.7 cents per black text page, 7.4 pages per minute, $100 retail) has reasonably priced inks.
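Using the article's own figures for the Epson B-510DN ($600, 1.3 cents per black text page) and the Oki C610dtn ($700, 1.1 cents per page), you can estimate how many pages it takes before the laser's cheaper pages pay back its higher sticker price. This sketch assumes black-text pages only and ignores drum, maintenance, and color-ink costs:

```python
def breakeven_pages(cheap_printer, cheap_cpp, dear_printer, dear_cpp):
    """Pages at which the pricier printer's lower cost per page
    offsets its higher purchase price. Prices in dollars."""
    return (dear_printer - cheap_printer) / (cheap_cpp - dear_cpp)

# Epson B-510DN inkjet vs. Oki C610dtn laser:
pages = breakeven_pages(600, 0.013, 700, 0.011)
print(round(pages))  # 50000
```

At roughly 50,000 black-text pages to break even, the "laser is always cheaper over time" rule clearly depends on print volume.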
Keep in mind that the inkjet printers you see going cheap with big mail-in rebates or included with laptop purchases generally aren’t the type that can hang with a laser printer in speed and costs. Instead, you’ll end up paying more in the long run via expensive, low-yield ink cartridges–to the point where it can even be cheaper to buy a new printer than to refill the ink in your old one.
(Warning: 4, outrageous)
The claim: People with more monitor space are more productive
Begging your boss for an extra display at work? You might sell her on the idea if you tell her that you'd be 30 to 50 percent more productive than you are on your single 18-inch display. At least, that's what a 2008 study from the University of Utah (commissioned by NEC, mind you) found for text and spreadsheet tasks.
NEC, naturally, was quick to trumpet the results as a way to move more of its widescreen displays. However, the study also found a point of diminishing returns: Productivity gains follow a bell-curve distribution, tapering off once you hit a certain amount of screen space. For a single-monitor setup, over 26 inches is too much, while dual-display gains top out at 22 inches.
In addition, the pattern of the results implies that while a second monitor can make you a wunderkind at work, you shouldn't even think about adding a third. Interestingly, users' reported preference did not predict their performance; that is, the setup they liked wasn't necessarily the one they worked best with.
So think about what you’d be using that second display for. The University of Utah study took place in a controlled environment, where the subjects did nothing but the text and spreadsheet tasks they were assigned. If that sounds like your office, you’ll probably do great with a second monitor.
If you’re planning on using that second display for e-mail, Twitter, or other Internet-related distractions, however, you’re probably going to end up being less productive overall. (I certainly am.)
(Warning: 2, mostly true)
The claim: Refilled ink cartridges will ruin your printer
Taking your printer’s ink cartridge to a refill service can save you a few bucks. But because cartridges aren’t designed to be reused, refilling has risks: Nozzles could clog, or the ink tank could spring a leak. A good rule of thumb is to monitor the cartridge closely so you can prevent damage to it–or to your printer–if something goes awry. That way, though the cartridge or printhead might be a goner, you are unlikely to cause any permanent damage to the printer itself, unless the cartridge leaked and you didn’t clean it up.
Note that refills done by a third party typically come with a guarantee that covers the cartridge (which may cost anywhere from $10 to $20)–but not necessarily the printer. The Cartridge World ink refill chain, for example, guarantees to repair a faulty cartridge or credit the cost against a new cartridge, but if your printer bites the dust, the company can only “provide advice or a qualified service technician to address any issues.”
Refill companies also like to remind you that it is illegal for your printer manufacturer to void the warranty on your printer for using third-party cartridges. True enough, but warranty agreements we’ve seen suggest that if a refill cartridge breaks your printer, you shouldn’t expect a free fix. For example, the HP warranty agreement explicitly states:
“For HP printer products, the use of a non-HP ink cartridge or a refilled ink cartridge does not affect either the warranty to the customer or any HP support contract with the customer. However, if printer failure or damage is attributable to the use of a non-HP or refilled ink cartridge, HP will charge its standard time and materials charges to service the printer for the particular failure or damage.”
If you’re worried about leaks, pull the cartridge out of the printer occasionally to see if any excess ink is pooling near where the cartridge rests in the printer.
(Warning: 2, mostly true)
The claim: Internet Explorer is less secure than other browsers
Everyone “knows” that Chrome, Firefox, and Safari are all way more secure than Internet Explorer. But what’s the real story?
To find out, I first looked up Symantec’s twice-yearly Internet Security Threat Report, which yielded the total numbers of reported vulnerabilities for 2009: Firefox had the most at 169, followed by 94 for Safari, 45 for IE, and 41 for Google Chrome. For more-recent data, I turned to the United States Computer Emergency Readiness Team (US-CERT), which hosts the National Vulnerability Database, a searchable index of reported computer vulnerabilities. A search of data for a recent three-month period yielded 51 such vulnerabilities for Safari (including both mobile and desktop versions), 40 for Chrome, 20 for Firefox, and 17 for IE.
Such counts alone aren’t the best way to measure a browser’s security, however. A browser with 100 security flaws that are patched a day after being discovered is safer than a browser with only one exploit that hasn’t been patched for months.
According to Symantec’s report, the average window of vulnerability (the time between when the flaw is reported and when it’s patched) in 2009 was less than a day for IE and Firefox, 2 days for Google Chrome, and a whopping 13 days for Safari. Clearly, Internet Explorer is doing fairly well. Nevertheless, you should still consider a few important factors before deciding to jump ship back to IE.
Stay updated. The second most common Web-based attack in 2009 exploited an IE security flaw patched way back in 2004 (the 2009 attack targeted PCs that had never installed the update). The latest version of IE 8 may be pretty safe, but ditch any earlier version you have.
Your browser is only as secure as your plug-ins. Symantec found that Microsoft’s ActiveX plug-in (enabled by default in IE) was the least secure with 134 vulnerabilities, followed by Java SE with 84, Adobe Reader with 49, Apple QuickTime with 27, and Adobe Flash Player with 23. The moral: Be careful at sites that use browser plug-ins.
It’s tough to be on top. IE still has the biggest piece of the browser pie, meaning that cybercriminals are more likely to target IE than other browsers.
(Warning: 4, outrageous)
The claim: You’re safe if you visit only G-rated sites
If your PC has ever had a virus, you probably know about the raised-eyebrow, mock-judgmental looks you get when you tell that to other people. After all, if you had been a good little PC user and stayed in the G-rated Web, you would have been safe, right?
Not so, says Avast Software, maker of Avast, a popular antivirus suite. “For every infected adult domain we identify, there are 99 others with perfectly legitimate content that are also infected,” its chief technology officer, Ondrej Vlcek, reports. In the United Kingdom, for example, users are far more likely to encounter infected domains with “London” in the name than with “sex” in the name.
So porn alone doesn't necessarily mean you're opening yourself up to infection. Which makes sense–porn-site operators depend on subscriptions and repeat visitors to do business, and infecting their customers with spyware isn't the best way to keep them.
If you find yourself on a generic-looking Website with popular search keywords in the title, or a site that’s rearranging your browser window, you’re likely to end up stuck with some malware–whether it’s about porn or about hotels in London.
(Warning: 4, outrageous)
The claim: You should regularly defragment your hard drive
Your hard drive has to decide where to write your files on the drive platter, and as you fill up the drive, your files will be scattered more and more widely across the platter. This means that the drive's read/write heads take longer to find the whole file, since they spend more time skipping around the platter to find the different parts of a fragmented file. However, this state of affairs isn't an issue these days, for several reasons:
Hard drives are bigger. When your hard drive capacity was measured in megabytes, fragmentation was a big deal. Not only did the drive’s read/write heads have to move all over the platter, but the space freed up by deleting old files was also scattered, and new files could be dispersed across the small gaps between larger files.
People now generally have more hard drive space and use a smaller overall percentage of their drive, so the read/write heads don’t have to move as much.
More RAM and optimized OSs help. Newer iterations of Windows have done a lot to reduce the impact that a fragmented hard drive can have on a PC's performance. According to the engineers who worked on Windows 7's updated Disk Defragmenter tool, Windows' file system allocation strategies, its caching and prefetching algorithms, and today's relative abundance of RAM (which permits the PC to cache the data actively in use rather than having to go back to the drive repeatedly) minimize fragmentation delay.
Solid-state drives don't need to be defragmented. SSDs have no drive platter or read/write heads that must go searching around the drive. In fact, defragmenting is generally not recommended for SSDs, because the extra write activity wears down the drive's flash memory cells, shortening the drive's overall lifespan.
You don’t need to go out of your way to defrag. In Windows Vista and Windows 7, the system automatically handles defragging. By default, defragging happens at 1:00 a.m. every Wednesday, but if your PC isn’t on or is in use, the process will occur in the background the next time the machine is idle. It will