I’ve been reviewing tech for 15 years. If you buy me these crappy gadgets, include a gift receipt

Status
Not open for further replies.
There are a lot of garbage SSDs out there. And the strange thing is that Amazon and Newegg seem to promote the garbage brands over Samsung, WD, Seagate, Solidigm, Corsair, etc. I don't want the Nextstorage SSD9000; I want the Corsair MP 600 H or the Samsung 980 Pro. Stop it already.
 
For those complaining about the tone of this article: I read this article as a fresh way for the author to advise us on what to buy by focusing on what NOT to buy, and ramping up the "hard to buy for" persona. A creative, clever, and somewhat entertaining alternative to the dime-a-dozen gift guides. It got me to read it!
Thank you. That is exactly the goal.
 
My pet peeve is people calling a 2560 x 1440 monitor "2K". It already has a real name, QHD, that describes the resolution, so why force "2K" onto it when it doesn't even make sense? Do you know what actually is a 2K monitor? 1920 x 1080, the exact thing you were saying was bad. 1920 is a lot closer to 2K than 2560 is. The funny thing is, even Wikipedia has a list of 2K resolutions, and surprise surprise, 2560 x 1440 is not on the list, but 1920 x 1080 is.
If you absolutely had to use the "K" nomenclature and you rounded 2560 to the nearest 1000, it would be a 3K monitor. Rounded to the nearest 100, it would be a 2.6K monitor.
I've written almost this exact same post. 1000% agree. Fortunately, some people have taken to calling it 1440p, which is better and (whether they intend to or not), includes the ultra-wide variety.

BTW, a point you missed: calling 2560x1440 "2K" makes "4K" sound like it has double the linear resolution, when it's actually just 1.5x as much. So it's very misleading to use "2K" nomenclature for that resolution.
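The rounding argument above is easy to sanity-check. A throwaway sketch (nothing authoritative, just the arithmetic):

```python
# Horizontal pixel counts of the resolutions discussed above.
widths = {
    "1920x1080 (FHD)": 1920,
    "2560x1440 (QHD)": 2560,
    "3840x2160 (UHD)": 3840,
}

for name, w in widths.items():
    nearest_k = round(w / 1000)          # rounded to the nearest 1000
    nearest_tenth = round(w / 1000, 1)   # rounded to the nearest 100
    print(f"{name}: {nearest_k}K (or {nearest_tenth}K)")

# 2560 rounds to 3K (or 2.6K), so "2K" never fits QHD. And the
# "4K"-vs-"2K" ratio: 3840 / 2560 = 1.5, not 2, so "4K" is only
# 1.5x the linear resolution of a 2560-wide panel.
print(3840 / 2560)  # prints 1.5
```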
 
You did see the word "ex" in there, right?
I wouldn't have been so glib about someone's marriage if you hadn't made it clear that it wasn't your present situation.

My dad was a very particular person and my mom was the impatient type that really liked to cross everything off her to-do lists. One probably could've seen ominous clouds over that pairing.
 
And for me...don't buy anything on my Amazon wish lists. That stuff is there as a placeholder, for me to look at later.
Unless there is a specific list labeled "Buy This Stuff"
Last I checked, you can make multiple lists. This way, your wishlist can serve its original purpose. However, I haven't used lists on Amazon for a long time.

I'm so lazy that I just have a ton of stuff in my cart that's "saved for later". The nice thing is I get price updates on it, which I occasionally use to keep tabs on something I'm expecting/hoping to get marked down.

What I do for people who insist on giving me gifts is to tell them: food or clothing. Most of them have good taste in both.
 
Last I checked, you can make multiple lists. This way, your wishlist can serve its original purpose. However, I haven't used lists on Amazon for a long time.

Yes, I have a couple dozen.

PC parts
3D printer
House stuff
Bikes
Tools
General junk
etc, etc, etc
 
Fortunately, some people have taken to calling it 1440p, which is better and (whether they intend to or not), includes the ultra-wide variety.
I much prefer 1080p/1200p/1440p, or even 2160p, over any other nomenclature (QHD, HD, FHD; I strongly dislike those terms). The default assumption is always 16:9 or 16:10 (just like the default assumption for CRTs is 4:3).

I did have an ultrawide 21:9 3440x1440 for a couple of months before I got rid of it. For work I think it is a great replacement for a two-screen setup, but since my old game of choice does not support that resolution, and the panel was VA (never buying one of those again), I don't see myself going back to ultrawide.
 
It always amuses me when people try to act elite like they can't get the job done on hardware 4X improved over what old timers used, not so long ago.

If you can't find a USB C adapter on ebay for $1 (okay, covidflation and all, call it $3), you're ineffective. Besides, what self respecting person has their system stuffed up within arm's reach to use the ports on it? Get it the heck out of the way. That's what a hub on your desk is for.

8GB is fine for the average person's use of a laptop, because it is excruciating to use a laptop for much, so it's just suffering through getting work done or web surfing. And if you don't have the discipline to not have two dozen tabs open, that's just a choice, like not cleaning out your car until eventually you need a Chevy Suburban to haul around all the clutter you can't manage. 😉

I agree wholeheartedly about monitors. Go big, go high res, or don't bother. On the other hand, I've had workstations where once there was a 4K large screen, there simply wasn't room for another of the same size, so it was nice to take a smaller 1080p, turn it sideways, and have that only add another ~16" of horizontal space taken up, because that fit within the available space. 1080x1920 is fairly usable for web pages.

Anyway, what to get a tech? Amazon Gift Certificate. If I can't spend it on tech, there are 1000 other ways to use it.
 
It always amuses me when people try to act elite like they can't get the job done on hardware 4X improved over what old timers used, not so long ago.
I think about this from time to time. The main problem is that software has changed, and so has the web. Operating systems have evolved both to support more resource-intensive software and have also grown to consume more of the available resources themselves.

Even programming languages have evolved to impose a lot more effort on the compiler, meaning that you'll regret trying to compile a bunch of Rust or C++23 code on a 20-year-old machine.

That's what a hub on your desk is for.
Eh, I'm not a fan of hubs. Part of the problem is the botched implementation by the USB specification, which means the host OS' USB driver pretty much needs to explicitly support all of the different hub chipsets, from what I've read. If the chipset in your hub isn't well supported, then you could experience issues you don't have when directly connecting your devices.

I don't mind plugging stuff directly into my PC. For connecting my phone, I just leave the cable plugged into the back of my PC.

8GB is fine for the average person's use of a laptop, because it is excruciating to use a laptop for much
Don't assume it's not connected to an external monitor & keyboard. A lot of businesses only give their employees a laptop and not a separate desktop, but they often do provide separate monitor + keyboard + mouse (and a docking station, if you're lucky).
 
8GB of RAM with Linux thrown on it is a pleasant experience. I threw it on a laptop and it floats around 1GB after booting.

So I wouldn't discount 8GB of RAM on a computer. Just 8GB of RAM on a Windows 10/11 machine.
Eh. I run Ubuntu 20.04 LTS with apparently the Compiz WM, Firefox with a couple of hundred tabs, some Ad/Tracker blockers, and the Auto Tab Discard extension for personal use (including a current Netflix binge) and Vivaldi with a couple dozen tabs for work. Between processes for the two and regular Ubuntu/Linux background processes I'm currently eating up 8.8 GB of RAM.
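For anyone wanting to check their own numbers the way the figures above were read, a quick Linux sketch (assumes a standard `/proc/meminfo`, as on Ubuntu):

```shell
# Human-readable totals; the "used" column is what the system is consuming.
free -h

# Same figures straight from /proc/meminfo: used memory in GiB, computed
# as MemTotal minus MemAvailable (roughly what desktop monitors report).
awk '/MemTotal/ {t=$2} /MemAvailable/ {a=$2} END {printf "%.1f GiB used\n", (t-a)/1048576}' /proc/meminfo
```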
 
Excuse me if I don't believe this is tongue-in-cheek.

More than 8GB is actually overkill for average users. Unless you are specifically doing rendering or gaming, 8GB is more than enough.

I use a 2-core, 4-thread Kaby Lake mobile laptop with a 7W TDP, 8GB of RAM, and a 128GB SSD. Space is possibly a future concern, but you conserve. Unlimited access to any resource is cancer to society.

And just like someone else said, you should be appreciative of gifts regardless of how bad they are.
 
And just like someone else said, you should be appreciative of gifts regardless of how bad they are.
While the gesture of giving a gift is nice, it's not really all that great if the person receiving the gift won't use it.

You should give something someone will actually use as a gift. And if you don't know what they want, give them cash (gift cards don't count, unless you know where they actually shop).
 
And this thought process goes in many other directions as well.

One of my daughters is the sous-chef at a reasonably large restaurant.

I would not consider buying her anything food related, unless she gave me the specific make/model/color/size she was looking for.
 
There are a lot of garbage SSDs out there. And the strange thing is that Amazon and Newegg seem to promote the garbage brands over Samsung, WD, Seagate, Solidigm, Corsair, etc. I don't want the Nextstorage SSD9000; I want the Corsair MP 600 H or the Samsung 980 Pro. Stop it already.
While I completely agree there are a ton of garbage SSD/NVMe brands out there, Nextorage is considered a high-end brand owned by Phison, comparable to Samsung and so forth.

My friend was the same way initially when I told him that I picked up an SK Hynix Platinum P41 2TB instead of a Samsung 980 Pro for my gaming drive. My primary OS drive is a Samsung 980 Pro 2TB, and they're both comparable.
 