Dell Tries to Help Customers with Misleading GPU Info

Page 2
Status
Not open for further replies.

nordlead

Distinguished
Aug 3, 2011
692
0
19,060
41


Yes, Dell should have assigned a competent person to take the advertisement back to the tech department, but they didn't. I can almost guarantee it. No company wants to get caught in a blatant lie, and Dell got called out on this one FAST. The conversation probably went something like this:

Advertiser: "Hey tech department, why should an average user buy a better graphics card?"
Tech Guy: <lots of mumbo jumbo> "... it looks better..." <more mumbo jumbo>
Advertiser goes off and creates the ad without going back to the tech guy, since he thinks he understands everything now.

Yes, it would take 5 seconds for the tech guy to spot it, but they probably never gave him a chance because they think they know what they are doing. I have the same problem at my work: they tell me it'll take me 1 week to replace the USB communications with UDP, and then it ends up taking 2 people working 5 months to rewrite the entire program, because they had no clue what they were asking for. Even if you tell them it'll take much longer, they don't want to hear it, so they ignore you.

I'm not saying that this is how it is supposed to be, just how it typically is.
 

jessterman21

Distinguished
This is unbelievable. It looks like they took the same image and blurred it + lowered the contrast using photo gallery. Good job Dell for misunderstanding your own tech. I guess ignorance is bliss?
 

njt

Distinguished
Apr 10, 2011
49
0
18,530
0
That's just an unfortunate byproduct of trying to sell high-tech parts to people with no knowledge of that specific tech, and then trying to explain it to them as clearly and simply as possible, on a sales-oriented website rather than a technical one. The blurry desktop *was* rather dumb, but then, who hasn't had a mental fart?
 

JeTJL

Distinguished
Oct 8, 2011
85
0
18,630
0
Both cards are adequately powered to play HD movies fluidly, so it's hard to show a difference there.

The cards are very close in performance **not sure about the clocks and the shader counts on the two cards compared to each other**, meaning gaming on either of these cards would produce roughly the same FPS in benchmarks.


 
Corporations lie to sell products. Not sometimes, all of the time.
Corporations only care about making a profit to keep their shareholders happy.

Take a look at the weight loss industry and all of the different dietary supplements. They all claim that they will help, but the only one recognized by the FDA to help is Alli. The rest are gimmicky ways that might help a little bit but not much, and some are complete bullsh*t.

Remember "Focus Factor"? They were sued cause they had no proof that it worked. When you look at the label all it is, is a vitamin suppliment.

Anytime a product is being sold, start by being extremely skeptical and assume it's all bullshit until you've done plenty of research. There are so many ways a company can manipulate advertising to make a product seem good.
 

fulle

Distinguished
May 31, 2008
968
0
19,010
14
[citation][nom]mayne92[/nom]...I read both and couldn't believe it was from the same person :p[/citation]

Just pushing an agenda however possible. In the 2nd quote, he was still pointing out the huge price difference, and just using the "blurry screen" as an excuse to make the post, which subtly pushes the pro-Dell, anti-Apple agenda.

Clever, actually.
 

clonazepam

Distinguished
Jul 10, 2010
2,627
0
21,160
119
If you're not one to keep up with the latest trends and technology developments, buying a computer can be pretty stressful.
These people just need to submit to the sales experts. Explain exactly what you want to do with your computer, how quickly you want tasks to complete, etc., and then let the expert tell you what to buy. Never go into buying a computer with a specific price in mind, because then stress and second-guessing are inevitable.

A better analogy would have been to display a house/car/boat/plane built out of legos on one screen, and the same design built in real life on the other screen, keeping sharpness and clarity equal.
 
G

Guest

Guest
That's a horribly pathetic image ... I've seen plenty of misleading marketing, but this is just gross.
 

jgutz2006

Distinguished
Jul 7, 2009
473
0
18,810
18
It's no different than fast food companies showing perfectly shaped, perfectly colored hamburgers when they are all fake or photo-edited... Yes, it's misleading, but why was my hamburger bun flat? In the commercial it's a perfectly rounded shape... damn it... you got me again.
 

lp231

Splendid
Hmm... maybe I wasn't clear enough on my previous post?
The difference between standard and high-end (from Dell's glossary)
where "standard" I assume must be onboard and "high-end" is anything that's discrete even if it's low-end.

1. Both onboard and discrete can produce a nice image
2. Both can watch high-def movies
3. Both are suitable for a user who does basic tasks like internet, Excel, and watching YouTube

Q: Suppose you are a seller. How can you show a buyer that the discrete card is better if all they do is run basic stuff?
Does current Intel graphics support GPU acceleration?
If not, then they can demonstrate that, so people see a real-world difference:
a graphics card without GPU acceleration will have high CPU usage, while one that has it will have lower CPU usage.
I guess GPU acceleration gets thrown out the window if onboard isn't Intel.
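The CPU-usage argument above can be sketched with a toy benchmark. This is an illustrative stand-in, not real video decoding: `software_decode` busy-loops to mimic CPU-bound decoding, while `hardware_decode` just sleeps, mimicking work offloaded to the GPU.

```python
import time

def cpu_utilization(workload, duration=0.5):
    """Run workload for ~duration seconds and report the fraction of
    wall-clock time this process spent on the CPU (1.0 = fully busy)."""
    wall_start = time.perf_counter()
    cpu_start = time.process_time()
    workload(duration)
    wall = time.perf_counter() - wall_start
    cpu = time.process_time() - cpu_start
    return cpu / wall

def software_decode(duration):
    # Stand-in for CPU-bound decoding: spin until the time is up.
    end = time.perf_counter() + duration
    while time.perf_counter() < end:
        pass

def hardware_decode(duration):
    # Stand-in for decode offloaded to the GPU: CPU mostly idle waiting.
    time.sleep(duration)

print(f"software decode CPU use: {cpu_utilization(software_decode):.0%}")
print(f"hardware decode CPU use: {cpu_utilization(hardware_decode):.0%}")
```

In a real demo you would play the same HD clip with hardware acceleration toggled on and off and watch Task Manager, but the contrast it shows is the same: near-pegged CPU versus nearly idle.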
 

clonazepam

Distinguished
Jul 10, 2010
2,627
0
21,160
119
[citation][nom]lp231[/nom]Q: Suppose you are a seller, how can you show the difference to a buyer that the discrete is better if they only do is run basic stuffs?[/citation]

This is the best I could come up with on the spot.

[citation][nom]clonazepam[/nom]A better analogy would have been to display a house/car/boat/plane built out of legos on one screen, and the same design built in real life on the other screen, keeping sharpness and clarity equal.[/citation]
 

K2N hater

Distinguished
Sep 15, 2009
617
0
18,980
0
Yeah, to prevent such false propaganda they should post:
HD 6350: Good for web browsing, desktop animations, Blu-ray and DVD films.
HD 6450: Good for web browsing, desktop animations, Blu-ray and DVD films.

But then... How many would pick the 6450?

Not many, obviously.

So if there's something wrong to talk about, let it be with the cards chosen.
 
G

Guest

Guest
Well, "back in the days" there could actually be a sharper image and better colours with a higher quality gfx-card, especially at high resolutions (like 1024x768) and high refresh rates. This was dependant on the DAC chip used. Cheap cards used cheap, low bandwidth DACs while good cards used high quality DACs.
Today, all analog outputs are done with good enough DACs that you'd be hard pressed to notice any visible difference between them, even at really high resolutions. If you start connecting several monitors via passive splitter cables, though, you can still see big differences between cards but it is quite random which can handle the load and which can't.
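To put a rough number on why DAC bandwidth mattered: the pixel clock a RAMDAC must sustain is width x height x refresh rate, plus blanking overhead. The ~35% blanking factor below is an approximation (real VESA timings vary by mode), so treat the results as ballpark figures.

```python
def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.35):
    """Approximate pixel clock (MHz) a RAMDAC must sustain for an
    analog mode; blanking_overhead roughly accounts for the ~35% of
    each frame spent in horizontal/vertical blanking."""
    return width * height * refresh_hz * blanking_overhead / 1e6

# ~90 MHz: comfortably within even a cheap era RAMDAC's rating
print(round(pixel_clock_mhz(1024, 768, 85)))
# A much higher mode pushes past 200 MHz, where cheap DACs got soft
print(round(pixel_clock_mhz(1600, 1200, 85)))
```

A DAC driven near its rated bandwidth rolls off high-frequency detail, which is exactly the "blurry at high res and refresh" effect the post describes.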
 

stingstang

Distinguished
May 11, 2009
1,160
0
19,310
19
Not surprising. A while ago I used to call customer support to test their knowledge base. I once asked an HP guy what RAM sticks came in this one computer with 6GB of RAM. He told me it was two 3GB sticks. Now that's news to me.
 

cyberangel_777

Distinguished
Nov 5, 2011
2
0
18,510
0
[citation][nom]mayne92[/nom]...I read both and couldn't believe it was from the same person :p[/citation]

He just illustrated her marketing skills ;-)
 

boon4376

Distinguished
Nov 29, 2011
2
0
18,510
0
Actually, if you assume those are high-resolution ~28" monitors, and the cheap graphics card has a VGA output while the expensive graphics card has an HDMI output, there would be a very noticeable difference in clarity, color range, black levels, etc...

So not as far from the truth as one might expect... A lot of cheap computers with integrated graphics still have VGA outputs.
 
[citation][nom]mrpijey[/nom]I have an old 512KB Cirrus Logic ISA card at home. It wouldn't be able to even draw a Windows desktop today, but it still gives me as sharp image as with the most expensive videocard today... unless you got an analog screen and a bad videocable you won't see any blurry image with a modern videocard and DVI/HDMI/DP since it's a digital signal, the only thing that happens with a bad cable is signal loss.A cheaper videocard won't give you a more washed out image, only a slower rendering rate at games and most intensive desktop applications.These ads are in my experience written by idiots that couldn't tell the difference between a computer and a fridge, and they are put in a position to make an ad that guides the unknowning consumer to get the best product for their needs...It's sad really.[/citation]

Although I mostly agree with you, I'd like to point out that a DVI connection from a graphics card just looks "crisper" than a VGA one on the same card. There can be a difference between two cards, but it's more directly the difference between analogue and digital.
 