Build Advice: New build questions?


Richard1234

how do I get back to the UEFI screen?

the computer now boots directly to the desktop,
the only way I can think of is to remove the M.2 drive, but that seems a bit drastic. just wondering if there is some trick to get the UEFI screen again?

also, is there any way to speed up the boot? the memory will be good for many years, no need to test the integrity every time; that's like checking your tyre pressure every time you get in the car.

these modern computers are super reliable,

also, checking integrity doesn't ensure integrity, as the memory could corrupt 1 second after the integrity check!
 

Aeacus

the comparison says the Asus is 300cd/m^2 versus AOC 700 cd/m^2, do you class that as "a bit" lower or is it "a lot" lower?

I presume cd means candle(s)? 300 candles in 1m^2 versus 700?
PCPP isn't that accurate. If you go to the official specs of the AOC, you'll see its brightness is actually 450 cd/m^2, not 700. Thus, 300 vs 450 is "a bit".

how does the ASUS compare with my LG 27UL500 which says 240 cd/m^2, this LG is totally useless, with blurry images.
Brightness has nothing to do with blurriness. So, you can't compare brightness to response time.

What does have to do with blurriness is response time. Your LG has it at 5ms, while the AOC is at 2.7ms and the Asus at 1ms. That's quite a difference, since when response time is 2ms or more, I can tell a difference (ghosting and blurriness).

that's a new trick! but what I was requesting was how to extract an excerpt from an existing image.
I do image editing as my hobby, so I know the helpful commands. Of course, I don't use M$ Paint on a daily basis, but I know a thing or two (I have other image editing programs I use).
M$ Paint has a crop tool. Video tutorial:

https://www.youtube.com/watch?v=u8J3JAXKEs4


how do I get back to the UEFI screen?

the computer now boots directly to the desktop,
the only way I can think of is to remove the M.2 drive, but that seems a bit drastic, just wondering if there is some trick way to get the UEFI screen again?
After POST, press Del.

Or easier: once you reboot and screen goes black, start pressing Del continuously, until UEFI comes up.

also is there any way to speed up the boot up, because the memory will be good for many years, no need to test the integrity every time, that's like checking your tyre pressure every time you get in the car.
Look for it in the UEFI Help menu. Fast Boot is very much a thing that you may need to enable. Also, the system shouldn't do memory training more than 2 times after hardware assembly.

Ever noticed that in supermarkets and also in elevators, there is background music playing? Usually some ambient tune, but it can be a plain old radio channel as well.

Can you tell me why supermarkets think it is a good idea to play background music? After all, installing speakers and keeping the background music going still costs something, and if it doesn't help in regards to income, why do it?
 

Richard1234

first a progress report, I eventually managed to get the UEFI screen again by pressing the correct key repeatedly.

now to troubleshoot the fan noise: I experimented by stopping the CPU fan. I did a visual confirm that the blades were not turning.

yet the clicking sound continued!

I then removed the inner parallel fan, and it too had halted, and the clicking continued! so the Be Quiet! fan was Being Quiet!

all the Noctuas halted, and the clicking continued.

only one other fan I could discern: the PSU fan, and the noise seemed to be coming from it.

removed the tower case PSU cover plate, then detached the PSU, powered up the machine, and the PSU fan was spinning, but when I scanned the PSU with my right ear, no problem. the clicking definitely continued from the mobo, away from the PSU.

I then placed the tower case in various orientations, powering on and scanning with my ear, and the clicking seemed to be coming from the back panel of the mobo. maybe some fan in there, or maybe a busted capacitor.

scanned variously, and it seemed to be from there.

rescanned to find the centre of the noise, and it in fact seemed to be from the Noctua back fan, which wasn't spinning. pressed against the case grille near the centre of the fan, and the clicking stopped! it was the fan.

so I then removed the power cable of that fan, and got quite a bit of silence. but eventually a clicking, this time from the middle fan at the top, which wasn't spinning. removed the power cable of that.

now even more silence, but eventually clicking, this time from the back fan at the top. removed the power of that.

and now writing this on the new PC, total silence! where I can work with this in the room.

I think possibly some of those Noctuas are defective. I'll email them to see what they say.

some of the Noctuas make a clicking sound when they aren't spinning.

this is one problem if you always have music on: you won't have any experience of the noise of the equipment, if any, as the noise will be drowned out by the music.

eg I bought a Packard Bell monitor once. in the shop it was great, but when I set it up, there was this continual transformer hum, so I took it back for a refund, saying I couldn't work with such a monitor, and all the other monitors I had used over the years, eg Philips, Acer, Samsung, were totally silent. in the shop, the ambient sound of the shop and mall meant I didn't notice the sound. but my PC is in a different room, via cables through the wall, and total silence, where the monitor transformer noise was the elephant in the room!

a lot of people buying that monitor might be working in an office, where the ambient office noise will make them not realise the monitor is noisy.

monitors should be seen not heard!

note with the above troubleshooting, I always powered off the machine before detaching power cables, although I did remove the inner CPU fan powered on, but didn't detach the power.

PCPP isn't that accurate. If you go to the official specs of the AOC, you'll see its brightness is actually 450 cd/m^2, not 700. Thus, 300 vs 450 is "a bit".
ok, that old problem of the manufacturer giving unreliable specs!

this was a major problem in the era of analogue hifi systems, where every component produced a different quality of sound. eg TDK cassettes produce a different sound from Sony.

often the headphones or loudspeakers which said say 35Hz to 16000Hz were far better sound than those which said 20Hz to 20000Hz.

where the 35Hz to 16000Hz was honest and tested, whereas the 20Hz to 20000Hz was either just guessed, where the truth might be 100Hz to 12000Hz, or included background hiss, or was like the WD Blue saying it is 6 Gbps. 20Hz to 20000Hz is the official hearing range of humans; in fact each person is different. dogs hear much higher frequencies, hence the use of dog whistles which only dogs can hear. it's also why dogs' voices tend to be very high, eg whining.


with analogue hifi the remedy to this problem was you should only evaluate hifi components by directly listening, specs are meaningless, and top end stores would have listening rooms, eg set up an amp, turntable, loudspeakers, then just try different amps, and see what each sounds like. or just try different turntables. there is ginormous variance, and some equipment is just a much richer and wider gamut of sound, and less noise.

when I taught myself some basics of electric guitar, in a music shop I asked the assistant to demonstrate some guitar amps. he connected a guitar to one amp, and it sounded great. then he reconnected it to another amp, and this one sounded sublime, like some rock legend playing in the room! where I thought "wow!".

I asked: how come that one sounds so much better?

he said: the first one is solid state, it's all based on transistors. but the second amp is a valve amp. he said the disadvantage of valve amps is they can only be used from the mains, and they take time to warm up. my grandfather had a valve radio, a huge machine, and it took time to warm up, but once warmed up, the sound was magic, which no transistor based radio can match.

the difference between the valve amp and the transistor amp was like comparing a top end Mercedes with a low end Trabant.

analogue technology is in fact the ultimate technology as it minimises distortion and is often totally parallel, but it is much more complicated to engineer. digital is much easier to engineer, as it can be engineered in software. an example of analogue technology is a mirror which is ultra parallel technology.


Dolby was used to reduce the innate noise of cassettes. CrO2 enabled a stronger signal than iron oxide without distorting, enabling the noise to be reduced relative to the signal (signal to noise ratio). metal cassettes gave an even stronger signal, but the catch with metal was that a metal cassette was more expensive than a vinyl record.

hifi isn't meant to sound nice, just to sound accurate. hifi means "high fidelity", which means high fidelity to the original sound: you record some sounds with some mikes, and when you play back it sounds identical to the original sound. if it sounds better, that isn't hifi!

true hifi enthusiasts don't use graphic equalisers, because each bit of electronics adds distortion, where a graphic equaliser will add a lot of subtle distortion. some top end amplifiers just had a power-on switch and volume; they didn't even have bass and treble controls, as these distort the sound.

their concept was you can't improve perfection, only deteriorate it!


top end hifi uses reel to reel tape, but this is beyond the scope of most people to utilise properly: firstly really expensive equipment, secondly really expensive tapes, thirdly it requires a lot of storage... cassettes have about 1mm of tape width per track, and with CrO2 this is higher quality than CDs. imagine if you had 10mm per track, it would be sublime quality. they deliberately made quality stuff beyond the reach of the public to prevent piracy! similarly with 35mm film, they could easily have made 70mm film for the general public but they didn't. early era film used huge glass plates, and the resolution and flatness are sublime. 35mm used a plastic reel which wasn't perfectly flat, and was small. it was good, but not for commercial quality images.

Brightness has nothing to do with blurriness. So, you can't compare brightness to response time.

What does have to do with blurriness is response time. Your LG has it at 5ms, while the AOC is at 2.7ms and the Asus at 1ms. That's quite a difference, since when response time is 2ms or more, I can tell a difference (ghosting and blurriness).
this is the true response time rather than the numbers alleged by the manufacturer?

the Asus is thus better in this respect?


I do image editing as my hobby,

it's the best way to do things, as a hobby!

once you do things professionally, it becomes a chore, and matter of fact.

nowadays everything I do is as a hobby.


so I know the helpful commands. Of course, I don't use M$ Paint on a daily basis, but I know a thing or two (I have other image editing programs I use).
M$ Paint has a crop tool. Video tutorial:

https://www.youtube.com/watch?v=u8J3JAXKEs4
this was the point at which I realised I don't have sound on the new PC! I managed to find some cheap USB headphones which I had bought for when I gave feedback, via remote access to my desktop, about a currency exchange website.

I cannot believe I never saw that "crop" button in the program, eg on this earlier screenshot:

http://www.directemails.info/tom/mobo/where.jpg

in the top left zone, to the right of "Select". I think you see what you want to see, or what you expect to see, not what is actually there.

I think I expected it either in a menu or on the right or at the top, so I didn't even look anywhere else even though I thought I did!


After POST, press Del.

Or easier: once you reboot and screen goes black, start pressing Del continuously, until UEFI comes up.
ok, I have it functioning now, which enabled me to progress with the troubleshooting of the clicking sound. I repeatedly misattributed the noise; I never thought a non-moving fan would make a clicking sound!

writing this post, total silence the entire time, now that I have removed the power of 3 of the Noctuas: the one at the back, and the middle and back ones at the top.

time to reattach the door and side panel!

but maybe I will reinstate the optical drive first from the 2010 PC.

I am pondering getting an external Quad layer compatible Pioneer drive, one advantage is I'd be able to use it from my laptop as well.

can I view the fan info from the MSI desktop?

or only from the UEFI startup interface?


Look for it in the UEFI Help menu. Fast Boot is very much a thing that you may need to enable. Also, the system shouldn't do memory training more than 2 times after hardware assembly.

Ever noticed that in supermarkets and also in elevators, there is background music playing? Usually some ambient tune, but it can be a plain old radio channel as well.
that music is called "muzak"! (pronounced myoozak)

Can you tell me why supermarkets think it is a good idea to play background music? After all, installing speakers and keeping the background music going still costs something, and if it doesn't help in regards to income, why do it?

they are engineering your mood!

the music is a kind of "I have all the time in the world and nothing to do" music!

the ongoing announcements from the ceiling are also some kind of hypnotism or mind control. when you are a kid, the adults are the voice of authority and the voice comes from above, so when the supermarket voice comes from above you interpret that as authority.

with Lidl in the UK, their announcements are just like airport announcements! I think that is deliberate, and they use airport jargon, eg they will say "terminal 3 is closing", where "terminal" is airport jargon for a wing of the airport, eg with London Heathrow you have terminals 1, 2, 3, 4, .... not sure how many they have today. Heathrow is the biggest international airport in the world, but the biggest airport overall is some US airport which isn't international. with Heathrow, they have planes landing literally every 20 seconds, where when a plane takes off, you will see another plane further ahead in the sky, and another ....

and the planes queue up in a spiral above London waiting to land!

another trick Lidl do is they put their prices above the items, eg:

expensive price
luxury item
cheap price
cheap item

people are used to prices below items, so people think the luxury item is super cheap, and they don't scrutinise their receipts!

it is a misleading representation of prices.



 

Richard1234

I have the machine connected to the monitor of the 2010 PC now, and have moved the main data disk SSD to it.

also I connected that LG monitor to the loudspeakers, and reconfigured the HDMI to forward the sound.

the graphics is a bit slow; it reminds me of the mobo graphics of my 2004 HP Pavilion before I got a graphics card, so I may install the graphics card soon, as it's a bit of a nightmare. also the handshake mouse is slower than with my 2010 PC!

this could be because the mobo graphics is taking too much bandwidth.

I got 3840 x 2160 graphics by default initially, and it was way too slow. when I tried reconfiguring to 1920 x 1080, the width was shrunken. eventually I changed the refresh frequency and this somehow caused 1920 x 1080 to fill the screen.

the graphics on the 2007 passive 3D LG monitor is pixel perfect and really precise, whereas this more recent LG has imprecise pixels, as if someone has blurred the pixels.

so overall it's worse than my 2010 PC! I just hope that installing the graphics card will give the same boost that the graphics card gave my 2004 HP Pavilion PC.

the bluetooth keyboard also is problematic with this machine, where it freezes up sometimes, and there is general latency of handshake mouse and bluetooth keyboard, where same mouse no latency with the 2010 PC.

I have got some of my old software running: one software item did have a 64 bit version, and for the other the 32 bit version functions fine. I have not yet gotten the WinUAE Amiga emulator functioning the way it did with 32 bit Windows 10 on the 2010 PC, where it becomes full screen as if the PC were an Amiga with today's hardware.
 

Aeacus

I think possibly some of those Noctuas are defective. I'll email them to see what they say.

some of the Noctuas make a clicking sound when they aren't spinning.
First time I've heard of fans doing that. :mouais:

Sure, contact Noctua and ask them what is going on.
My only guess: when the fan doesn't spin, you've set the MoBo to still feed some voltage to the fans, whereby the motor tries to spin but, due to the lack of enough voltage, it only clicks.

E.g. in a similar sense, when you have a wall clock that operates from a battery (e.g. a few AA batteries) and also has a hand for seconds: once the battery charge drops too low, the hand for seconds just jitters in place, without having enough power to move forwards.

this is one problem if you always have music on: you won't have any experience of the noise of the equipment, if any, as the noise will be drowned out by the music.
Well, you don't have to have background music playing. You just need to increase the amount of white noise to mask other, smaller noises from the PC. Or you can even play some ambient music. E.g. I've heard that listening to ocean waves or crickets in the woods (natural sounds) is relaxing. I, personally, don't like ambient sounds as background music. I'd rather have some good trance to listen to. But to each their own.

this is the true response time rather than the numbers alleged by the manufacturer?
In today's world, only an in-depth review can say what the true values/numbers for a piece of hardware are. Everything else is from official specs. How much you believe them is up to you.

These response time numbers are from official specs. Thus far, I've seen that most monitor spec values have been accurate. Except HDR certification. That one is usually exaggerated.

the Asus is thus better in this respect?
The shorter the response time - the better. So, yes, Asus is better.

can I view the fan info from the MSI desktop?
There are 3rd party programs to view fans.

The best telemetry program is HWiNFO64, in Sensors mode,
link: https://www.hwinfo.com/download/

It shows you a plethora of information, essentially everything that can be shown.
It also has a logging feature, where you can record the telemetry and compare it later. E.g. when benchmarking the CPU/GPU. Then after the benchmark, you can open up the recorded logs and see how your system fared at any given point.

this could be because the mobo graphics is taking too much bandwidth.
Your X670E chipset MoBo doesn't have any graphics chip on it. The graphics chip is inside the CPU. So, when the CPU has to work harder, it may also impact graphical smoothness, especially when the CPU has to output 4K graphics (3840x2160). And the iGPU inside the CPU isn't that powerful. Better to run 1080p reso with the iGPU. For 2K and 4K reso, there is the dedicated GPU, which has the power to easily output higher resolutions.
 

Richard1234

I am thinking now of getting an external USB3 bluray writer, the one I am considering does quad layer, which I think is 128Gig per disk.

I am considering this one:

https://www.amazon.co.uk/Pioneer-External-BDR-X13EBK-Reliability-PureRead/dp/B0C367V1QW/

ordinary blurays are 25Gig/disk, and BD-R disks cost £0.885 per disk including postage for 1-6x TDK ones. this is very cheap storage for things not often looked at eg backups of hardware installer CD isos, eg 250Gig is then £8.85. 1 terabyte is some £35

now my argument for going for external versus internal, is:

1. I find that SATA optical drives slow down the boot process, as there is inertia with optical disk drives. also I think they slow down the shutdown process.

2. the SATA optical drive will be permanently on; these don't have power switches on the drive, just an eject button. so they will be gradually wearing out and drawing unnecessary power.

3. the optical drive eats up a drive letter.

by using an external drive, I only need to power up the drive when I plan to use it; there are many months where I don't use optical disks at all. main recent uses are to create the Windows 10 and 11 installer disks, to install Windows 10 + 11 from, and to run Linux Mint without installing it.

thus I think it is much more efficient to use external optical drives. AND there's the added advantage of being able to use it from my 2023 laptop, to take the load off the other PC.

An argument for installing two M.2 drives to M2_2 and M2_3 is that presumably these slots become obscured by a graphics card?

whereas M2_1 looks directly accessible?

I am thinking of getting a 2nd M.2 drive, to test how fast it is to backup the one drive verbatim to the other. I want to do this before some riskier installs in case the existing install gets corrupted.

I found the extra MSI software only appears if I agree to the privacy statement, here are screenshots of what these supply:

http://www.directemails.info/tom/mobo/further_installs1.jpg
http://www.directemails.info/tom/mobo/further_installs2.jpg

where I am rejecting anything to do with AI, as I want to be in control of my machine and not rely on some pattern based guesswork!

I also don't like predictive text, as I like to know how to spell words. I know the words I am unsure of, usually relating to double letters, eg embarrassing: is that double r? double s?

where I might mis-spell these, but only because it's not worth looking them up.

I also don't use grammar checkers, as I like to understand grammar, and also grammar checkers enforce american grammar, which is different from and less sophisticated than british grammar. eg americans will say "different than", but in british english "than" is only used for comparatives, so we say "different from". but we say "bigger than"; we only use "than" for a "transitive" relation ~ where A~B~C => A~C

"different" isnt transitive as 1~2~1 but 1 not~ 1.

americans will also misuse double letters, eg they write focused, instead of british focussed, the american spelling is logically incorrect as a single s followed by e means the u is pronounced yoo, eg used, abused, fused, recused, whereas double s prevents modification of the u sound, eg fussed, bussed, trussed. the e modifies the earlier u or a or o

the american spelling of focused would be pronounced fokyoozd.

british english is basically more nuanced and subtle and precise than american english. this relates to history that american english derives from farm settlers, whereas standard british english was the language of the elite people organising the settlement colony. british english is like parisian french and moscow russian and berlin german.

this new win10 install is implementing some corrective text, I want to disable this. I find corrective text interferes with my thought process. the only place predictive text is useful for me is with Google search, as it helps me find correct spellings if I am unsure and shows me better search expressions.

there is a problem with say a 3840 x 2160 screen, that eg with Photoshop, the interface gadgets and text are microscopic eg this screenshot:

http://www.directemails.info/tom/mobo/microscopic_fonts.jpg

I think this monitor must be interpolating pixels with its hardware pixels, which is the probable cause of the blur. is there a way to get the resolution which gives the pixels of the monitor, ie pixel perfect graphics?

eg my 2007 LG gives pixel perfect graphics which is vastly better than this much more recent LG.


First time I've heard of fans doing that. :mouais:

Sure, contact Noctua and ask them what is going on.
My only guess: when the fan doesn't spin, you've set the MoBo to still feed some voltage to the fans, whereby the motor tries to spin but, due to the lack of enough voltage, it only clicks.
I may need to lower the non-spinning voltage then; I will reconnect the worst fan and experiment with that. really the electronics ought to mitigate this problem. this is where it's not a polished design, and it's why you need beta testing. most software is beta tested, most hardware isn't.


E.g. in a similar sense, when you have a wall clock that operates from a battery (e.g. a few AA batteries) and also has a hand for seconds: once the battery charge drops too low, the hand for seconds just jitters in place, without having enough power to move forwards.
I haven't seen this phenomenon! probably because I don't use wall clocks, as the one I tried was way too noisy! and the other ones I have seen are mains powered.

Well, you don't have to have background music playing. You just need to increase the amount of white noise to mask other, smaller noises from the PC. Or you can even play some ambient music. E.g. I've heard that listening to ocean waves or crickets in the woods (natural sounds) is relaxing.

natural sounds yes, as we have evolved for more than 250 million years for those, but computer simulated ones are dubious, as they are at some level fake. there is some part of the brain which rejects fake stuff, this is why we forget dreams within a few minutes, as the brain rejects the dreams as being fake. you will only remember a dream if you write it down very soon after waking up.

in the summer you will feel better for various reasons, one is that sound travels further.

once, in a railway station waiting room, there was a guy with a dog. the dog was whining unhappily continually, but I noticed that every time someone entered the waiting room it stopped whining. I realised it might be because of the sound, so I jammed the door open, and the dog stopped whining! the dog basically needs the sound of the outdoors, otherwise it's like a human without light!

there is a compelling argument that humans evolved along seashores, and not as hunters.

this is why when we see a sandy beach (some beaches eg Brighton are pebbly), we feel we are in paradise, its because this is where humans originally evolved.

eg human feet are perfect for sandy beaches, but useless for inland terrain, whereas inland mammals have no problem.

it was only when all the seashores were used up, that humans settled along rivers and lakes, and only when the rivers and lakes were used up, that the remaining humans were forced inland to become hunters.

fossil remains of the seaside humans tend to have vanished, as they either cremated the dead or threw them in the ocean. so the fossil remains tend to be of the inland primitive hunters.

the most primitive traditional societies are hunters, the most advanced ones are fishermen.

human teeth are small and weak, this is an adaptation to eating seafood which is very easy to eat, smaller teeth enable a bigger brain.

humans cannot eat a mammal without tools, but we can eat seafood without tools. seafood is also the most nutritious for humans.

human babies can swim right from birth, but cannot walk for ages. whereas most mammals can walk within a few minutes of birth.

the way women rock babies mimics the ocean waves.

humans don't have fur except on the scalp, and beards for men; fur enables survival in very hot and very cold climates, eg polar bears.
similarly feathers. but humans cannot handle the climate of northern europe without clothes even after thousands of years!

at Bristol zoo, the gorillas will be outdoors in the winter naked without problem! because they have fur. (that zoo has been relocated).

humans would swim with their heads above the water, where the fur shielded from the sun.

also most mammals cannot control their breath, its only waterside and water based mammals which can, eg elephants, whales, dolphins,

I think gorillas and chimpanzees cannot control their breath and are afraid of water.

the upright form is also a waterside adaptation, eg penguins, bears, otters also are upright.

human biology also is wasteful of water, eg humans have a very high amount of sweat glands, whereas say a dog has very few,

it's because, living by water, you don't need to conserve water.

inland animals sweat much less in order to conserve water.

humans would work the coast in pairs, the one person would wade in the water, and throw caught things to the guy on the shore, this is why humans like throwing and catching objects at short range!


I, personally, don't like ambient sounds as background music. I'd rather have some good trance to listen to. But to each their own.
silence is the best background for study, eg a library!

In today's world, only an in-depth review can say what the true values/numbers for a piece of hardware are. Everything else is from official specs. How much you believe them is up to you.

you need 3rd party reviews. in Germany, they have an organisation, TÜV, which independently tests EVERYTHING objectively. if their rating is "sehr gut", that means "very good" in german. there is a "conflict of interest" if a firm reviews its own stuff! nobody can be objective about themselves, only about others. when you evaluate yourself that is "subjective", namely "subject verb object"; when you evaluate others that is "objective".

in Britain they are bringing in the TÜV idea for some things, eg restaurants and takeaways get externally rated for hygiene. whether the food tastes good is subjective, but the hygiene is objective. things like schools get externally rated by Ofsted.

with cars, you have to get an MOT every year, which tests if the car is roadworthy, fixing any deficiencies before it passes. eg the wiper fluid wasn't working, which would have failed the MOT.

These response time numbers are from official specs. Thus far, I've seen that most monitor spec values have been accurate. Except HDR certification. That one is usually exaggerated.


The shorter the response time - the better. So, yes, Asus is better.
I may go for that one, but can you verify it isn't outdone by any available ones?


There are 3rd party programs to view fans.

The best telemetry program is HWinfo64, in Sensors mode,
link: https://www.hwinfo.com/download/

if I can figure out how to download it!

it says it is downloading, but it doesn't appear in the downloads list of Microsoft Edge.


It shows you a plethora of information, essentially everything that can be shown.
It also has a logging feature, where you can record the telemetry and compare it later. E.g. when benchmarking the CPU/GPU. Then after the benchmark, you can open up the recorded logs and see how your system fared at any given point.


Your X670E chipset MoBo doesn't have any graphics chip on it. The graphics chip is inside the CPU. So, when the CPU has to work harder, it may also impact graphical smoothness, especially when the CPU has to output 4K graphics (3840x2160). And the iGPU inside the CPU isn't that powerful. Better to run 1080p reso with the iGPU. For 2K and 4K reso, there is the dedicated GPU, which has the power to easily output higher resolutions.

there is something problematic, same experience as my 2004 HP pavilion before I got the graphics card, where when you move a window, it billows like a flag.

it makes me wonder whether the thing to do is get a low end CPU with a top end graphics card! eg get a celeron and an RTX 4090
 

Richard1234

First time I've heard of fans doing that. :mouais:

Sure, contact Noctua and ask them what is going on.
My only guess: when the fan doesn't spin, you've set the MoBo to still feed some voltage to the fans, whereby the motor tries to spin but, due to the lack of enough voltage, it only clicks.
I have now reconnected all 7 fans to the mobo, and set the voltage to 0 up to 65°C, with a rapid transition to 12V at 75°C, and the clicking halted as soon as I reached the worst fan at the back. so that problem is fixed!

I left the CPU fan unchanged, and I think the PSU's fan doesn't appear in the UEFI

now to another fan related technical question,

for each fan, I can associate it with one of the following:

CPU core
system
MOS
chipset A
chipset B
T_SEN1
T_SEN2

the Be Quiet! fan is, I think, CPU Core, and the others are all currently also CPU Core,

the fan allocations are:
PUMP_FAN1 = top back fan
SYS_FAN1 = top mid fan
SYS_FAN2 = top front fan
SYS_FAN3 = front upper fan
SYS_FAN4 = back fan
SYS_FAN5 = base fan
PUMP_FAN2 = front lower fan

which zone should I allocate to each fan?

no idea where MOS, Chipset A, chipset B are, and not sure what System refers to other than "everything"

another technical question, for the UEFI options, I need a wired mouse and keyboard, so I use the 2007 monitor in the other room,

is there a way to split the video output USBC socket, to switch between the 2 monitors, or even to display both at the same time, both via USBC to HDMI currently?

because currently I have to disconnect the USBC to HDMI of the one monitor and attach the USB C to HDMI of the other monitor,

I tried using another USB C socket for the 2007 monitor, but it didn't work.

I am going to do a compressed sector backup of the M.2 drive via Linux Mint next, so I will have to visit the forum via another of my 3 PCs (the 2023 or the 2010, but not the 2024).
 

Aeacus

3. the optical drive eats up a drive letter.
With the entire alphabet to use as drive letters, this is hardly an issue. Also, if you connect an external drive, the system automatically allocates a drive letter to it. So, a drive letter will be used up regardless.

An argument for installing two M.2 drives to M2_2 and M2_3 is that presumably these slots become obscured by a graphics card?

whereas M2_1 looks directly accessible?
Yes.

I found the extra MSI software only appears if I agree to the privacy statement, here are screenshots of what these supply:

http://www.directemails.info/tom/mobo/further_installs1.jpg
http://www.directemails.info/tom/mobo/further_installs2.jpg

where I am rejecting anything to do with AI, as I want to be in control of my machine and not rely on some pattern based guesswork!
None of them are needed for the PC's normal operation.

Here's the description of two that are missing from the installation page:

Super Charger:
Super Charger provides iPad, iPhone and iPod charging function.
The iPad has very special charging requirements, as it requires a 1.6A power supply rather than the 0.5A current available with conventional USB interfaces; that is, an ordinary computer cannot charge your iPad even when powered on. The MSI Super Charger is a Windows resident program capable of revising the power supply mode of your USB port. Once an iPad is connected to your USB port, the Super Charger sends a signal to initiate its charging circuit.

Devices Speed Up:
Devices Speed Up allows you to improve USB data transfer speed and speed up data storage performance.

USB BOOST - supports faster data transfer rates of the USB storage devices.
STORAGE BOOST - supports faster access speed of storage device.

Source: MSI Center User Guide.

this new win10 install is implementing some corrective text, I want to disable this.
Guide here: https://www.majorgeeks.com/content/page/turn_autocorrect_on_or_off.html

there is a problem with say a 3840 x 2160 screen, that eg with Photoshop, the interface gadgets and text are microscopic eg this screenshot:
Might need to upscale the icons.
Fix here, under "Workaround": https://support.microsoft.com/en-gb...-devices-508483cd-7c59-0d08-12b0-960b99aa347d

Display settings are found: Settings - System - Display.

silence is the best background for study, eg a library!
Like i said, to each their own.

Just because you like to sit in complete silence, doesn't mean others have to or are required to.

When one gets used to sitting in silence so one can focus, it can come back and bite one in the arse. Namely, the outside world is rarely completely silent, whereby one can't focus anymore, since there are constant sounds around them.
But when one can focus even when there are sounds around them, then one can do their task far better, without losing focus just because there are sounds.

I may go for that one, can you verify it isnt outdone by any available ones?
For sure there are better monitors out there than the AOC or Asus. But since your criteria for monitors are so strict, there isn't much to choose from. And if you factor in availability as well, then you are left with only a few choices.

If you want the best, then currently the best 2K, 27" monitor with a flat screen is:
MSI MPG 271QRX QD-OLED,
specs: https://us.msi.com/Monitor/MPG-271QRX-QD-OLED
review 1: https://www.tweaktown.com/reviews/10681/msi-mpg-271qrx-qd-oled-1440p-360hz-gaming-monitor/index.html
review 2: https://www.pcgamer.com/hardware/gaming-monitors/msi-mpg-271qrx-review/
store: https://www.overclockers.co.uk/msi-...03ms-a-sync-gaming-monitor-mon-msi-01490.html

With this, the Asus monitor is outdone by the MSI monitor on so many levels that it would be pointless to compare the Asus with the MSI (like comparing a Trabant with a Koenigsegg).

if I can figure out how to download it!

it says it is downloading, but it doesn't appear in the downloads list of Microsoft Edge.
It's your PC, you need to figure it out.

I don't use M$ Edge at all. It's filled with security holes like Swiss cheese, and Micro$oft doesn't need to know what I browse. So I instead use Mozilla Firefox, with plenty of security add-ons. I do have Google Chrome as well, for backup. And Tor too, just in case.

and I think the PSU's fan doesn't appear in the UEFI
For sure it doesn't appear, since there is no data cable between the PSU and the MoBo. Only very few digital PSUs have this option. The Corsair AXi is one such unit.

no idea where MOS, Chipset A, chipset B are, and not sure what System refers to other than "everything"
MOS = MOSFET, part of the MoBo VRM. So, essentially the VRM temperature.

Chipset - near the bottom right corner of the MoBo. Since it takes up essentially 1/4 of the MoBo, it very well may have 2 temp probes to cover it.

System - could be a temp probe for the ambient temperature inside the PC case. Where that temp probe is exactly located - I don't know, since it differs from MoBo to MoBo. The bottom left corner is one option.

which zone should I allocate to each fan?
They can all remain under CPU, so that when the CPU gets hot, all case fans adjust their speed accordingly.

But if you want to fine-tune it by allocating each fan to the closest temp probe, then it should be along the lines of:
MOS - rear exhaust fan + 3x top exhaust fans
chipset - bottom intake fan and 2x front intake fans

Do note that readings from MoBo temp probes don't fluctuate around as much as the CPU temp does. So, you very well might end up with a constant fan RPM, without change. E.g. my chipset temperature usually stays at 32C.

is there a way to split the video output USBC socket, to switch between the 2 monitors, or even to display both at the same time, both via USBC to HDMI currently?
A KVM splitter is hardware that can mirror the video signal. It works in reverse to a KVM switch.
 

Richard1234

I set up Linux Mint to back up the M.2 sectors to a USB3-encased WD Blue 2T drive.

the drive didn't appear! I reattached it directly to a mobo USB socket rather than via a hub socket, and then it did appear.

began a compressed sector by sector backup,

this was progressing, and eventually all things relating to drives froze up.

either the encasing was kaput or the drive was kaput.

its a 3.5" drive, so with much effort I connected the drive directly via SATA,

this needed an extra SATA power socket, as the existing daisy chain won't reach the front of the HDD cage. so I had to remove the PSU cover panel; then the current SATA power socket on the PSU obstructed the further one, so I rearranged things, with the furthest 3x2 sockets used first, where 2 are now in use for SATA. then I decided to attach the mobo SATA cables directly to the SATA drives rather than via extenders.

1 of the 4 supplied cables has a 90° plug at one end. I was trying to see what the idea of this was: it's no use for the mobo end, as it arrives awkwardly at the lower sockets, and it's from the wrong direction for the upper socket. at the drive end, it is a complete waste of time for the side bracket, as it cannot arrive from that direction.

then for the HDD cage, the sockets are at the front, and as the clip is above, it arrives from below the drive, but this obstructs the drive below, and if it's the lowest drive, it's at an awkward angle!

conclusion: 90° SATA sockets are a dubious idea, no idea why they supplied one with the mobo!

anyway, I installed all 4, and used the 90° one at the WD Blue drive.
wear the worst one out first.

I have reinstated the upper HDD cage. on the first attempt it didn't go back in fully, and I found it was pinching the SATA power cable to the side bracket SSD! I retried, checking the back of the HDD cage, and got it reinstated, with no problem from the HDD cage being a bit rickety.

now I reloaded Linux Mint, and the drive appeared in the drive hardware software, eg GParted, but not on the desktop, which requires "mounting" a drive. eventually, by loading the file explorer and exploring the drive, it got mounted.
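for future reference, the mounting can presumably also be done from a terminal instead of via the file explorer. a minimal sketch, untested on this machine, assuming the partition shows up as /dev/sda1 (lsblk would confirm the actual name first):

$ lsblk                          # list drives and partitions, to find the right device name
$ udisksctl mount -b /dev/sda1   # mounts it under /media/<user>/<label>, the same place the file manager uses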

I began the sector backup, and it appeared to freeze at eg some 4.2Gig, but I found that the figure just wasn't being updated, and in fact it all succeeded!

these are the shell commands and ASCII output for the backup:

$ date ; sudo dd if=/dev/nvme0n1 | gzip -c >> /media/mint/sectors/2024_04_09_1639_M2_win10.gz ; date
Tue Apr 9 16:40:06 UTC 2024
3907029168+0 records in
3907029168+0 records out
2000398934016 bytes (2.0 TB, 1.8 TiB) copied, 7515.86 s, 266 MB/s
Tue Apr 9 18:45:22 UTC 2024
$

where it took 2 hours 5 minutes to back up the M.2 drive (all 2 TB of sectors read, of which some 281 Gigabytes are actually in use). the date commands enable me to know how long it took.

Mint says 266 MB/s transfer speed, from M.2 to WD Blue SATA 3.
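a side note I haven't tried on this machine: GNU dd has a status=progress option that prints the bytes copied as it goes, which would have shown the copy wasn't frozen, and the restore would presumably just be the same pipe in reverse. a sketch using the same device as above and a hypothetical backup.gz file name:

$ sudo dd if=/dev/nvme0n1 bs=1M status=progress | gzip -c > backup.gz   # same backup, but reports progress as it runs
$ gunzip -t backup.gz                                                   # check the compressed file is intact, writes nothing
$ gunzip -c backup.gz | sudo dd of=/dev/nvme0n1 bs=1M status=progress   # restore: decompress and write the sectors back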

anyway, it seems the USB2 encasement is kaput, unless the 2T is beyond the limit of USB2?

where maybe it works until the drive is filled beyond its limit?

I had a siesta whilst the drive was copying, and at some point I was dreaming of someone laughing, woke up and it was the optical drive chugging! where it was a kind of HA HA HA kind of intermittent laughter.

at one point I woke up dreaming of being by the seashore, with palm trees, with a swift breeze, and it was the Noctua fans powering up!

I managed to get the handshake mouse and wireless keyboard both functioning on Mint via the bluetooth dongles. you have to enter a PIN on the keyboard to complete the pairing.

With the entire alphabet to use as drive letters, this is hardly an issue.
wrong!

maybe for you, but even with my 2004 PC I ran out of drive letters, and had to keep some drives inaccessible as not enough letters, where I had to rearrange letters to access those!

I have a lot of drives, and some have several partitions, and then say a floppy drive, optical drive, and until recently I had a camera memory card reader, which ate up something like 4 letters! one for each format.

right now where I have barely installed the system, I have already used up 7 letters. 27% of letters used up!

with more involved projects, I will backup to maybe 4 or 5 drives simultaneously via scripts, and one of the archiving drives has 4 partitions, so just that would take the count to 11, which is 42%.

with that drive, one partition is for digitising my vinyl collection, another is for CD isos, eg hardware installer CDs and DVDs, eg for my printer,
etc.

Also, if you connect an external drive, the system automatically allocates a drive letter to it. So, a drive letter will be used up regardless.
yes, but only temporarily, when I use the external drive, rather than permanently using up a drive letter which a lot of the time isn't used. I would then have to detach one of the 2 SATA plugs to remove it. too much hassle.

I want the optical drive letter to be used up on demand, and not permanently.

I like to do things on demand, rather than by default.

eg just switch on the lights or heating in the rooms being used, rather than switching on all the lights and heating!

they ought to put power switches on hardware, so you can opt the hardware out easily without having to unplug it.

looks like I can remove the drive letter of the optical drive, but the drive is there using up power and gradually wearing out.


I need to make some permanent decisions before installing the graphics card.

None of them are needed for the PC's normal operation.

Here's the description of two that are missing from the installation page:

Super Charger:

that one is a complete waste of time for me! as I never use Apple products: overpriced and, in my opinion, worse than Windows for computers, and worse than Samsung for smartphones.

Devices Speed Up:
Devices Speed Up allows you to improve USB data transfer speed and speed up data storage performance.

USB BOOST - supports faster data transfer rates of the USB storage devices.
STORAGE BOOST - supports faster access speed of storage device.

I would have hoped the hardware did this automatically!

is the hardware deliberately slowing things down unless you select the right options?

I don't get what they are saying: doesn't a drive have its own speeds, and then the hub its own, and the bus its own speeds? I don't see what they can add.


ok, have done that and so far it seems to be working!

much better without the meddlesome autocorrect. I think autocorrect will deteriorate the intellect as your brain is supposed to do the autocorrecting.

Might need to upscale the icons.
Fix here, under "Workaround": https://support.microsoft.com/en-gb...-devices-508483cd-7c59-0d08-12b0-960b99aa347d

Display settings are found: Settings - System - Display.
this has worked, at 200% scaling, the fonts are now readable, but seem to always be smaller than the Desktop and browser fonts.

which looks like this:

http://www.directemails.info/tom/mobo/reasonable_fonts.jpg

versus the original:

http://www.directemails.info/tom/mobo/microscopic_fonts.jpg
which is no good on the HDMI socket of the newer LG.


at 300% the fonts are now nice, but the system ones too big,

I can temporarily rejig the system ones to 300% to get the photoshop ones a nice size.

this is what it looks like at 300%:

http://www.directemails.info/tom/mobo/better_fonts.jpg

I have moved the newer LG monitor to the same room as the new PC in order to be able to use the wired keyboard and mouse, and not have to keep changing monitor cables.

now, this time I connected it with DP for the first time; until now it was via HDMI, and the screen image is VASTLY better with DP than with HDMI. it is really sharp, and I can handle a smaller font, eg the 200% image above is just about tolerable.

Like i said, to each their own.

Just because you like to sit in complete silence, doesn't mean others have to or are required to.

When one gets used to sitting in silence so one can focus, it can come back and bite one in the arse. Namely, the outside world is rarely completely silent, whereby one can't focus anymore, since there are constant sounds around them.
But when one can focus even when there are sounds around them, then one can do their task far better, without losing focus just because there are sounds.
if other sounds are getting through then that can be a major problem, eg some workers working outdoors, eg hurling scaffolding or worst of all hammering!

you are lucky you weren't born in the 1700s, as there would be no music you could put on for your work!

in a war, a lot of young people today will be unable to cope, because the US will shut down most of the websites and GPS satellites, there won't be mains to recharge the smartphones as the russians will hit the power stations with hypersonic missiles, so no smartphones,

overdependence on technology, where the young people are unable to even tie their shoelaces or swear without an app!

but I will have no problem, eg 1973 to 1976 we lived in a remote place in Africa with no telephone even, no postal system (we had to use a PO box), no television (a choice by my mother), no radio, no music, no shops, etc, and it was one of the best eras of my life, as we had to make things happen directly, no automatic stuff. if I wanted music I had to either whistle it or sing it!
and the only music I knew was the songs we were taught at school; I didn't know any pop music, and didn't know any classical music, just a few songs from school eg the national anthem.

and now with AI, people will stop thinking altogether, relying on AI to think for them.

For sure there are better monitors out there than the AOC or Asus. But since your criteria is so strict regarding monitors, there isn't much to choose from. And if you factor in availability as well, then you are left with few choices only.

I would allow the monitor to be less than 27", down to maybe 23"; I don't know if that enables any other choices.

but I have now found that the newer LG with DP is good; it's with HDMI that the image is a bit blurry, the HDMI signal must be being interpolated with bad effect.
If you want the best, then currently the best 2K, 27" monitor, with flat screen, is;
MSI MPG 271QRX QD-OLED,
specs: https://us.msi.com/Monitor/MPG-271QRX-QD-OLED
review 1: https://www.tweaktown.com/reviews/10681/msi-mpg-271qrx-qd-oled-1440p-360hz-gaming-monitor/index.html
review 2: https://www.pcgamer.com/hardware/gaming-monitors/msi-mpg-271qrx-review/
store: https://www.overclockers.co.uk/msi-...03ms-a-sync-gaming-monitor-mon-msi-01490.html

With this, Asus monitor is outdone by MSI monitor on so many levels, whereby it would be pointless to compare Asus monitor with MSI (like comparing Trabant with Koenigsegg).
I reread your comments on OLED in the earlier topic, there are no negatives then?

I scrutinised the info on the purchase page, and it does look impressive. the high price tag of course means I would think carefully before purchasing, because once you move to the £1000 price, or even £500, you don't want to change the decision: if you changed a £1000 decision 3x a day for a year, or 1x a day for 3 years, that would be £1 million!

also I don't want to spend 1000 and then find I should have spent 1200 on some much better OLED, and now have 2200 spent.

but I can afford to make a few £200 mistakes and even more £100 mistakes, and a ton of £20 mistakes! mistakes are a necessary evil, collateral damage. some wisdom is only possible by wasting money on the wrong decisions!

eg in an earlier post, you said:

After all, you've already bought high-end Noctua fans. Why not use them?

I disagree with this philosophically! there are some things one buys, then realises they were a mistake, and it is BETTER to not use them. although, as explained, I have now fixed the Noctua fan problem, by having 0 volts up to 65°C and then a sharp rise in voltage to 12V at 75°C, and no clicking problem.

an example where people make this mistake, is they buy a load of snack foods in town. they are trying to lose weight. at night the snack food use by date is say yesterday. a choice: to lose weight, best to not eat, and to discard. but that means money lost on the purchase, so they eat the food as it will still be ok, and the weight problem continues! the correct decision is to discard the food, and not feel obligated to eat it just because you bought it.

now the LG with DP is sufficiently good for the moment, the HDMI is substandard! years of suffering with the 2010 mobo as that was HDMI only! not sure if that 2010 mobo could do DP via a graphics card?

but the impressive description of the colours, contrasts, etc, makes me want to get one, but after some careful study.

I have one criticism of the above OLED, which is it is 2560 x 1440 which is lower res than this LG, which is 3840 x 2160, ie 2x linear dimensions of HD.

whereas the above OLED is 1.333333 x linear dimensions of HD, which will lead to interpolation fuzz for my existing HD images.

can you locate a good OLED which is 3840 x 2160?

then it would be all round better than this LG, because right now this LG is ahead on res, ie higher res, and better aligned with HD.

It's your PC, you need to figure it out.

I don't use M$ Edge at all. It's filled with security holes like Swiss cheese, and Micro$oft doesn't need to know what I browse. So I instead use Mozilla Firefox, with plenty of security add-ons. I do have Google Chrome as well, for backup. And Tor too, just in case.
I switch between Edge and Firefox. by default I use Firefox, but I find it problematic for amazon.co.uk pages, where often the photos don't load or it takes too long. one handy feature of Edge is "screenshot", which allows a jpeg capture of the entire scrolled webpage, ie including the stuff you have to scroll to! the main minus of that is the jpeg quality deteriorates the longer the webpage is; .png would be much better!

I used to screenshot webpages on Firefox, by taking screenshots at multiple scroll points, and then merging them together into the bigger picture, but now I can do this via Edge in one step.

For sure it doesn't appear, since there is no data cable between the PSU and the MoBo. Only very few digital PSUs have this option. The Corsair AXi is one such unit.


MOS = MOSFET, part of the MoBo VRM. So, essentially VRM temperature.

Chipset - near bottom right corner of MoBo. Since it takes up essentially 1/4 of MoBo, it very well may have 2 temp probes to cover it.

System - could be temp probe for ambient temperature inside the PC case. Where that temp probe is exactly located - i don't know, since it differs from MoBo to MoBo. Bottom left corner is one option.


They all can remain under CPU, so that when CPU gets hot, all case fans adjust their speed accordingly.

But if you want to fine-tune it, by allocating each fan to the closest temp probe, then it should be along the lines of:
MOS - rear exhaust fan + 3x top exhaust fans
chipset - bottom intake fan and 2x front intake fans

Do note that reading from MoBo temp probes doesn't fluctuate around as much as CPU temp does. So, you very well might end up in constant fan RPM, without change. E.g my chipset temperature usually stays at 32C.
I found meanwhile that I had misunderstood the control panel; in fact it just tells me the temperature for each zone, it doesn't allow me to associate them! but the info you gave is interesting.

could you annotate a photo or a diagram of where the different things are?

A KVM splitter is hardware that can mirror the video signal. It works in reverse to a KVM switch.
any purchase link for such?

ie 1 PC connected to 2 monitors,

when the PC has multiple video sockets, you can put a different monitor on each, but with this one, without the graphics card, I think the only video out socket is the one USBC one; I tried another USBC and it doesn't work.

I also tried a USB C hub with HDMI and USB C sockets, and that also doesn't work. just the HDMI works, which I was using for the 2007 monitor.
 

Aeacus

I would have hoped the hardware did this automatically!

is the hardware deliberately slowing things down unless you select the right options?
Here, you need to contact MSI and ask them about it.

Hardware has its fixed limitations, which I've shared in this topic. Now, the MoBo is made by MSI and it is a possibility that MSI somehow limited the bandwidth software-wise, whereby you need another piece of software to unlock the full potential - I don't know. Hence why you should ask MSI.

ok, have done that and so far it seems to be working!
(y)

this has worked, at 200% scaling, the fonts are now readable, but seem to always be smaller than the Desktop and browser fonts.
(y)

now this time I connected it with DP for the first time, until now it was via HDMI, and the screen image is VASTLY better with DP than with HDMI, where it is really sharp, where I can handle a smaller font eg the 200% image above is just about tolerable.
DP is meant to connect monitors to a PC, where it has advantages over HDMI. HDMI is used to connect TVs to a PC, and since you'd be sitting quite far from a TV, HDMI doesn't need to show you a crisp image. But when you sit up close, a crisp image is needed. Hence why you use DP with monitors.

I would allow the monitor to be less than 27", down to maybe 23"; I don't know if that enables any other choices.
Let's go over the monitor specs once again, so I know what to look for:

Size: 23" to 27"
Resolution: 1080p? or 2K? or 4K?
Screen curvature: None (flat panel)
Panel type: VA? Or quantum-dot LED (QD-OLED)?
Contrast ratio: min 3000:1? (TN and IPS panels have contrast ratio of 1000:1)
Response time: less than 2ms? Preferably 1ms?
Refresh rate: 144 Hz? More? Less?
VESA mount: Yes? No?
Something else?

About monitor size;
Most PC monitors are viewed from ~1m or closer (I look at my monitor from ~70cm or so, and I have a 23", 1080p, VA panel monitor).
Now, with monitors, 1080p is good for up to 23" screens at a viewing distance of ~1m. For a 27" 1080p monitor at ~1m, you may start to see individual pixels. So, to combat that, a better pixel density monitor is needed. The next step up would be 1440p (aka 2K). Hence why most 27" monitors are 1440p.
Now, if you stretch the monitor even bigger, to 32", then 1440p at ~1m distance may also produce the issue of you seeing individual pixels. Again, the same applies: for better pixel density, a higher reso is needed, which is 4K (aka 2160p). And there's also a reason why most 4K monitors start at 32".

Now, if you were to sit further away from the monitor, e.g 1.5-2m, you can get away with 27" 1080p and 32" 1440p monitor.

Same is with TVs, but since viewing distance of TVs is further away than with PC monitors, TVs can be bigger in size, before you run into issue of seeing individual pixels.
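To make the pixel-density argument concrete, here is a rough Python sketch that computes pixels per inch (PPI) and pixel pitch for the sizes discussed above; the distance at which individual pixels become visible varies with eyesight, so treat the numbers as a ballpark only.

Code:
import math

def ppi(width_px, height_px, diagonal_in):
    # pixels along the diagonal divided by the diagonal length in inches
    return math.hypot(width_px, height_px) / diagonal_in

for name, w, h, diag in [("23in 1080p", 1920, 1080, 23),
                         ("27in 1080p", 1920, 1080, 27),
                         ("27in 1440p", 2560, 1440, 27),
                         ("32in 2160p", 3840, 2160, 32)]:
    d = ppi(w, h, diag)
    print(f"{name}: {d:.0f} PPI, pixel pitch {25.4 / d:.3f} mm")
# 23in 1080p ~96 PPI, 27in 1080p ~82 PPI, 27in 1440p ~109 PPI, 32in 2160p ~138 PPI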

Checked my local store and regarding TVs;
up to 32" are HD (720p reso)
starting from 32" to 43" are FHD (1080p reso)
starting from 43" to 85" are UHD (2160p reso aka 4K)
starting from 85" are UHD-2 (4320p reso aka 8K)

I think you can already see how TV reso goes up, as TV diagonal increases. While viewing distance remains the same.
(And i know that TV manufacturers have their own formula to calculate proper sitting distance based on screen diagonal, but who follows that? :cheese: )

I reread your comments on OLED in the earlier topic, there are no negatives then?
That MSI monitor is QD-OLED aka quantum-dot LED (QLED).

Every display panel type has it's flaws. None are perfect.

E.g PC monitors;
TN panel - washed out colors, terrible contrast ratio, terrible view angle; while offering fastest response time and cheapest price.
IPS panel - long response time, terrible contrast ratio, high price, colors/brightness will fade in time; while offering great color accuracy.
VA panel - no obvious flaws per se, slightly worse color accuracy than IPS, slightly worse view angles than IPS, better response time than IPS but worse than TN. So, "jack of all trades" to say so. But where VA panel excels, is great contrast ratio and durability in terms of keeping it's color accuracy over time.

TVs;
OLED - high price, burn-in issues, low brightness; while offering great color accuracy, consumes less power than other two TV panels and can be very thin.
QLED - high price, high power draw, bloom issues; while offering 0 burn-in issues, great contrast ratio and brightness.
QNED - high price, high power draw; while also offering 0 burn-in issues, great contrast ratio and brightness. But QNED just came out and surely people will find issues with this one as well.

1st OLED TV was produced by LG (back in 2013). Samsung response to that was QLED. Now, LG answered Samsung and came out with the variation of Samsung's QLED, called QNED.
Diff between the three: https://www.makeuseof.com/qned-vs-oled-vs-qled-what-is-the-difference-and-which-is-best/

I doubt that you can tell a diff between QLED and QNED. But since QNED is successor of QLED, it should have better view experience. E.g reduced bloom effect that some QLED panels (especially older ones) suffer.

not sure if that 2010 mobo could do DP via a graphics card?
If GPU has DP output port, 2010 Gigabyte MoBo can do image over DP as well.

can you locate a good OLED which is 3840 x 2160?
I see that you've mixed up OLED and QLED. OLED has severe burn-in issues that i've explained. QLED is quantum-dot LED and has 0 burn-in issues, thus is better than OLED.

These are all QLED monitors with flat panel (7 options),
pcpp: https://uk.pcpartpicker.com/products/monitor/#sort=size&P=9&C=0

As you can see, 27" QLED monitors are 1440p (2K), while 4K monitors are all 32" or bigger.

I switch between Edge and Firefox. By default I use Firefox, but I find it problematic for amazon.co.uk pages, where often the photos don't load or it takes too long. One handy feature of Edge is "screenshot", which allows a jpeg capture of the entire scrolled webpage, ie including the stuff you have to scroll to! The main minus of that is the jpeg quality deteriorates the longer the webpage is; .png would be much better!

I used to screenshot webpages on Firefox, by taking screenshots at multiple scroll points, and then merging them together into the bigger picture, but now I can do this via Edge in one step.
There are add-ons for Firefox for that. I also have one such add-on in use.

could you annotate a photo or a diagram of where the different things are?
Here:

[Image: QJpIpBu.png - annotated photo of the MoBo showing where the temp probes / components are located]


any purchase link for such?
Here's one such item,
amazon UK: https://www.amazon.co.uk/Hopbucan-Two-Way-Splitter-Switcher-Monitor/dp/B0C7G8R87F/

Since it can be used either way, it is both the KVM switch and KVM splitter, depending on which side you use for monitor(s).
 

Richard1234

Distinguished
Aug 18, 2016
273
5
18,685
I am doing a sector backup of my main data SSD, which is a 500G SATA Samsung, to the WD Blue, both via SATA on the new PC and Linux Mint, and it is progressing unbelievably fast, not yet complete.

I found that the WD Blue and the M.2 drive are identical bytesizes, so I can potentially do a verbatim backup of the M.2 to the unused WD Blue. sometimes drives with the "same" size, arent the same size, but only approx the same, eg I think a PNY SSD ages ago was a different size from the Samsungs.

writing this from the 2010 PC with the 2007 LG monitor. I was going to use the headphones for sound but found they are USB, where the USB arrangement is dubious, so instead I arranged some speakers via the monitor's headphone socket and the HDMI to the monitor. I have a spare pair of speakers, because some years ago I got an intermittent sound overlaid on the audio and thought the speakers were kaput, bought some identical ones second hand on ebay, and had the same problem. After a lot of troubleshooting I found it was the ethernet-over-mains gizmos causing the problem, namely this product:

https://www.bt.com/help/user-guides...ng/broadband-extenders/broadband-extender-600


I am thinking of reformatting the M.2 and reinstalling Windows 10 from scratch, and maybe allocating 500G for Win10, but I will do some further experiments first. This time I will note each customisation and install necessary for future reference, eg halting the predictive text, switching off one drive, enlarging the fonts, ...

I cannot find a settings and transfer wizard for Windows 10, so what I will do is just back up the Firefox bookmarks and passwords, and the Edge favourites (I call them bookmarks, because not all are "favourites", thus that terminology isn't accurate). As I divert downloads away from the system drive, it's just the browser bookmarks, passwords and history that are important; I cannot see a way to back up history, neither for Firefox nor for Edge. Edge wants me to back up to OneDrive, but I refuse to do that.


Here, you need to contact MSI and ask them about it.


Hardware has its fixed limitations, which i've shared in this topic. Now, the MoBo is made by MSI and it is a possibility that MSI somehow limited the bandwidth software-wise, whereby you need another piece of software to unlock the full potential - i don't know. Hence why ask it from MSI.
I will delay for the moment


DP is meant to connect monitors to PC, where it has advantages over HDMI. HDMI is used to connect TVs to PC and since you'd be sitting quite far from TV, HDMI doesn't need to show you the crisp image. But when you sit up close, crisp image is needed. Hence why use DP with monitors.
I wasn't aware of a problem, because the 2007 LG is pixel-perfect over HDMI; thus I think the problem is in the newer LG monitor, which must be remapping the HDMI image in a worse way.


Let's go over monitor specs once again, so i know what to look for;

Size: 23" to 27"
bigger probably is better, but the 2007 monitor is 23" and is fine.

one disadvantage of bigger screens, is you have to turn your head more, whereas with a smaller screen, you can just move your eyes which is much more efficient.

Resolution: 1080p? or 2K? or 4K?
integer multiples of HD are best, and that probably means powers of 2, ie 2x, 4x, ...
thus I think I'd prefer 4K, at first I thought 2K would be 2x, but reading your later comments, I see that 2K is in fact 1.3333x, and 2x is 4K!

Screen curvature: None (flat panel)
Panel type: VA? Or quantum-dot LED (QD-OLED)?
Contrast ratio: min 3000:1? (TN and IPS panels have contrast ratio of 1000:1)
Response time: less than 2ms? Preferably 1ms?
Refresh rate: 144 Hz? More? Less?
PAL 50Hz is fine, so maybe anything 50Hz or higher. I had a CRT multisynch in the 1990s, and I found that something like 72Hz caused problems for my brain! some frequencies might interfere with the brain's internal frequencies, eg alpha waves, etc.

VESA mount: Yes? No?

if the inbuilt stand is good, then maybe VESA isnt necessary. the newer LG has a dreadful stand, C shaped, where the C faces the user, ie:

monitor C keyboard user

the problem is I often have notes or text on the desk between keyboard and monitor, eg for comparing my notes with some internet info, and that C shape is obstructive. eg I want to have the 2023 laptop between keyboard and monitor, and it wont fit!

now maybe I should get a replacement VESA stand.

Something else?
I think a headphone socket is also important, that makes attaching loudspeakers easier

About monitor size;
Most PC monitors are viewed from ~1m or closer (i look at my monitor at ~70cm or so and i have 23", 1080p, VA panel monitor).
Now, with monitors, 1080p is good for up to 23" screens at viewing distance of ~1m. For 27" 1080p monitor and at ~1m, you may start to see individual pixels. So, to combat that, better pixel density monitor is needed. Next step up would be 1440p (aka 2K). Hence why most 27" monitors are 1440p.

but my newer 27" LG is 3840 x 2160:

https://www.lg.com/uk/monitors/uhd-4k-5k/27ul500-w/

Now, if you stretch the monitor even bigger, to 32", then 1440p at ~1m distance may produce also the issue of you seeing individual pixels. Again, same would happen, for better pixel density, higher reso is needed, which is 4K (aka 2160p). And there's also a reason why most 4K monitors start at 32".

Now, if you were to sit further away from the monitor, e.g 1.5-2m, you can get away with 27" 1080p and 32" 1440p monitor.

Same is with TVs, but since viewing distance of TVs is further away than with PC monitors, TVs can be bigger in size, before you run into issue of seeing individual pixels.

Checked my local store and regarding TVs;
up to 32" are HD (720p reso)
starting from 32" to 43" are FHD (1080p reso)
starting from 43" to 85" are UHD (2160p reso aka 4K)
starting from 85" are UHD-2 (4320p reso aka 8K)

I think you can already see how TV reso goes up, as TV diagonal increases. While viewing distance remains the same.
(And i know that TV manufacturers have their own formula to calculate proper sitting distance based on screen diagonal, but who follows that? :cheese: )
but they might be making assumptions with their sourcing, as my LG is 4K and 27",

eg as there can be so many manufacturers + models of monitors, a firm might decide to narrow down their sourcing, by making some assumptions, which might work for the guy making the decisions, but might not work for 60% of people out there!

eg when I got a laptop in 2023, they guided me to a 13" HP, and that doesnt work for me. when I mentioned this, someone said they refuse to use a laptop less than 16".

That MSI monitor is QD-OLED aka quantum-dot LED (QLED).

Every display panel type has it's flaws. None are perfect.

E.g PC monitors;
TN panel - washed out colors, terrible contrast ratio,
these 2 are the deal breakers!

terrible view angle;



while offering fastest response time
this is less of a problem, as I mostly work with static images, eg text, or photos, but you dont want the response time to be noticeable.

with vision, the centre of the retina has the slowest response time, as you move away from the centre, the response time becomes faster. eg with some fluorescent lights, if you look directly, the light is steady, but if you look at them with peripheral vision they flicker!

at night, stars can be seen clearer if you look at them off centre, this relates to sensitivity rather than response time.


and cheapest price.
we spend more time looking at monitors for using computers than looking at anything else, thus we mustnt just buy cheap!


IPS panel - long response time, terrible contrast ratio, high price,


colors/brightness will fade in time;
this is a problem,


while offering great color accuracy.
VA panel - no obvious flaws per se, slightly worse color accuracy than IPS, slightly worse view angles than IPS, better response time than IPS but worse than TN. So, "jack of all trades" to say so. But where VA panel excels, is great contrast ratio and durability in terms of


keeping it's color accuracy over time.
this is important, you dont want a technology where the colours deteriorate with time,

TVs;
OLED - high price, burn-in issues, low brightness; while offering great color accuracy,


consumes less power than other two TV panels

less important, main problem with higher power is if it needs fans causing noise,

I have solar panels, so in the summer, my electricity is free up till 10pm,

the house draws from the panels first, then any deficit is from the grid,
and it exports any surplus to the grid,

net effect is during the summer, the mains meter stays fixed during the day

in June and July, they can generate 2500 Watts in the middle of the day,
when they were installed approx Aug 2012, one day it was raining, and they were generating 600 Watts!

ie they dont need direct sunlight, and will generate electricity even in rain, only at night does generation go to zero. if you can see the daylight with your eyes, they will generate electricity.

right now 15:36, cloudy with slight rain in March, 634 Watts.

and can be very thin.
QLED - high price, high power draw, bloom issues;

not sure what you mean by "bloom", in ordinary english, blooming is when all the flowers appear, eg at the moment bluebells are blooming. in german, "Blume", means flower, pronounced bloome,


while offering 0 burn-in issues, great contrast ratio and brightness.
QNED - high price, high power draw; while also offering 0 burn-in issues, great contrast ratio and brightness. But QNED just came out and surely people will find issues with this one as well.
I have more faith in Samsung than LG!

my Samsung smartphone, really great colours,

I have an LG frost free fridge, really great, but helping someone get a frost free to their requirements, I guided them to a smaller Samsung, but its freezer compartment is 3 full drawers, whereas with mine, the lowest drawer is less depth as there is fridge mechanism at the back. they can fit much more stuff in their smaller Samsung! they also got something like a 5 year guarantee.

LG is good, Samsung is better!

1st OLED TV was produced by LG (back in 2013). Samsung response to that was QLED. Now, LG answered Samsung and came out with the variation of Samsung's QLED, called QNED.
Diff between the three: https://www.makeuseof.com/qned-vs-oled-vs-qled-what-is-the-difference-and-which-is-best/

I doubt that you can tell a diff between QLED and QNED. But since QNED is successor of QLED, it should have better view experience.

dangerous assumption! more recent by another firm isnt always better, I would prefer to not assume anything and study the facts. sometimes more ancient stuff is better, eg 3D technology has mostly vanished.

can be a confusion tactic, to create a similarly named technology, makes people assume its better!

with second hand comics, "very good" means "very bad"! there are some traders who make a lot of money from this confusion! eg the one trader told me they only trade "very good condition", handing me these bad condition comics! if you want really good condition you want eg "near mint", top quality is "gem mint",

E.g reduced bloom effect that some QLED panels (especially older ones) suffer.


If GPU has DP output port, 2010 Gigabyte MoBo can do image over DP as well.


I see that you've mixed up OLED and QLED.
ok, I made a basic blunder!

OLED has severe burn-in issues that i've explained. QLED is quantum-dot LED and has 0 burn-in issues, thus is better than OLED.

These are all QLED monitors with flat panel (7 options),
pcpp: https://uk.pcpartpicker.com/products/monitor/#sort=size&P=9&C=0

As you can see, 27" QLED monitors are 1440p (2K), while 4K monitors are all 32" or bigger.
how is pcpartpicker managed?

do they actively scan all models on the market, or do they filter at all?

perhaps the QLED technology has a lower size limit of the pixels, and maybe that is why they are 2K at 27" rather than 4K like my newer 27" LG? where the LG maybe can do smaller pixels via the technology spread out depthwise, but with less impressive colours.

I think colour is more important than resolution, eg the Sony Trinitron technology went for a lower resolution but much better and more stable picture. the earlier technology was a hexagonal array of r, g, b pixels. but with trinitron, they had black vertical bars between the pixels, and much better image, where you could see the pixels more easily than the earlier technology. with the earlier technology, the pixels would seem to be swirling around as you looked at them, no such problem with Trinitron. Trinitron also went for a subset of a cylinder for the surface, rather than a subset of a more spherical surface. Trinitron only curved left to right, the earlier CRTs curved both up + down and left + right.

using the comparison feature, I'd say the ASUS ROG is the best,

the MSI one is only 250 cd/m2, the ASUS ROG is the only one with both forms of frame sync, it has 4 sockets 2x HDMI, DP, and USBC,

only the Gigabyte FO32U2P has more, with a mini-Displayport also, and only the 2 Gigabytes have inbuilt speakers, but that is less important.

when I want really high quality sound, I connect the headphone socket of the PC to my 1980s amplifier, and listen on ginormous 1980s hifi speakers and the sound is sublime!

32" is a bit big, I wouldnt be able to have the loudspeakers on the sides of it!

note that, although you might have a big screen, its only the central area of the retina which is really high res, peripheral vision is less precise. eg if I look at a word on the screen with a smaller font, I can see the entire word really precisely, but the nearby words are less precise, I can read them but as you go further away from the central vision, they become less readable.

I need to think further, maybe if I look at some in town to get an idea. I might go for a 27", eg that earlier one you recommended.


There are add-ons for Firefox for that. I also have one such add-on in use.
what is the add on called? as I may try to install it, eg if its more flexible than the inbuilt Edge feature as that one is lower quality image if the webpage is longer.


Here:

[Image: QJpIpBu.png - annotated photo of the MoBo showing where the temp probes / components are located]



Here's one such item,
amazon UK: https://www.amazon.co.uk/Hopbucan-Two-Way-Splitter-Switcher-Monitor/dp/B0C7G8R87F/

Since it can be used either way, it is both the KVM switch and KVM splitter, depending on which side you use for monitor(s).
sounds interesting to be bidirectional!

2 PCs to one monitor could be useful for using DP from both the laptop and new PC,
and 2 monitors to one PC will be useful for using without installing Linux Mint, as I need the wired mouse and keyboard to arrange the bluetooth ones

Like i said, to each their own.

I dont completely agree with this philosophy!

sometimes I wish people had told me earlier about things I did,

an anecdote: there was an MBA student locally, where he would go outdoors every few minutes, to have another cigarette. I told him smoking is bad, and every puff of the cigarette deteriorates the health of the throat and the bronchus surface etc,

He said he isnt going to quit smoking, its what he does, and he enjoys it.

then some months later I met him on the bus, and he told me he had quit smoking!

I think if someone does things which you dont approve of, you should at least tell them, but you should then leave them to make their own decision.

I once got talked into a Kirby Avalir hoover, for something like £800, I thought I had made a great decision, and was enthusing about it to the guy from 1995. His wife then said scornfully: nobody pays £800 for a vacuum cleaner!

I am glad she didnt restrain her opinion, because that then planted the seed of doubt in my mind. I decided to research the matter, as 14 day refund period because of UK laws. Went to town, and found that a Dyson and a Vax were both A rated on everything, and at most £135. then with difficulty found the ratings of the Avalir, and they were dreadful, eg I think an F on one rating. So I got a refund, and bought a Vax, and its really great.

the Vax uses Dyson's technology, because he made the mistake of patenting it, and the patents expire after I think 20 years! Nobody ever replicated Coca Cola, because they never patented their drink.
 

Richard1234

Distinguished
Aug 18, 2016
273
5
18,685
Here's one such item,
amazon UK: https://www.amazon.co.uk/Hopbucan-Two-Way-Splitter-Switcher-Monitor/dp/B0C7G8R87F/

Since it can be used either way, it is both the KVM switch and KVM splitter, depending on which side you use for monitor(s).

have placed an order for this, and another identical M.2 drive, where I will have a 2T M.2 at each of M2_2 and M2_3 and then install the GPU,

the PC has become faster by using DP, with HDMI it was like my 2004 HP Pavilion with Intel Celeron!

but now with DP it is good.

the 500G Samsung SSD to WD Blue compressed sector copy via Sata on the new PC, took about 2 hrs 37 minutes,

the shell info:

Wed Apr 10 11:52:32 UTC 2024
976773168+0 records in
976773168+0 records out
500107862016 bytes (500 GB, 466 GiB) copied, 9407.45 s, 53.2 MB/s
Wed Apr 10 14:29:20 UTC 2024
$

this isnt just copying, but is compressing also,
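As a quick sanity check of those figures (assuming the transcript above is standard dd output), the byte count and elapsed time do line up with the reported 53.2 MB/s and the roughly 2 hr 37 min duration:

Code:
bytes_copied = 500_107_862_016          # from the transcript above
seconds = 9_407.45
hours, rem = divmod(seconds, 3600)
print(f"{bytes_copied / seconds / 1e6:.1f} MB/s")   # ~53.2 MB/s
print(f"{int(hours)} h {rem / 60:.0f} min")         # ~2 h 37 min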


the earlier

MSI 27" MPG 271QRX QD-OLED​

is out of stock at all venues!

https://uk.pcpartpicker.com/product...2560-x-1440-360-hz-monitor-mpg-271qrx-qd-oled
 

Richard1234

Distinguished
Aug 18, 2016
273
5
18,685
If you want the best, then currently the best 2K, 27" monitor, with flat screen, is;
MSI MPG 271QRX QD-OLED,
specs: https://us.msi.com/Monitor/MPG-271QRX-QD-OLED
review 1: https://www.tweaktown.com/reviews/10681/msi-mpg-271qrx-qd-oled-1440p-360hz-gaming-monitor/index.html

this review gives some negatives eg "Blurry text" and "random pixel artifacting",

the article was published March 6th, and says:

-------------------------------------------
If you happened to catch any news out of CES 2024, you would have heard almost every company that makes monitors is releasing a version of the new QD-OLED panel,
----------------------------------------------

but this suggests it's a new technology; it is unwise to buy into new technology, since if you wait, the technology improves and the prices drop.

he also says:

--------------------------------------
While that is all extremely positive, there are some things worth mentioning that I found to be negatives. Throughout my testing, I noticed seemingly arbitrary pixel lines turning to white and then reverting back to their designated color. This was an issue that I didn't notice immediately, but then, after I noticed it and began looking for it more, I found it was occurring much more frequently. I have provided some GIFs of the noticed issue below, and unfortunately, I was unable to find a fix for the problem throughout my testing.
--------------------------------------

that problem is a phenomenon you'd see with the Amiga 500's blitter chip,

where you'd see rectangles of colour "billowing" like a flag in a breeze.

its basically because the display rate is too fast, and is faster than the chips can move the data,

even if your eye cant see 360Hz, the conflict of data rates may be very visible!

with the Amiga, the 50Hz Pal frame rate overtakes the blitter moving say lots of rectangles, where its a coherency problem.

you'd see the problem also with arcade machines of that era, approx 1988, where they had really impressive 3D graphics, using purpose built computers, and you'd literally perceive the pixels being sprayed onto the video screen!

in those days, those artefacts were regarded as impressive, like a car skidding (B) or a driver turning the steering wheel too fast (A), when in fact such things are that the car is going too fast: you'd fail the driving test because you have "lost control" of the vehicle for (A), and the vehicle has lost control of the road with (B). both of which are from deficient skill. for the test you are advised to keep both hands on the steering wheel except when say changing gear or other such unavoidable action.

the way coherency with computers occurs, or more accurately incoherency, is that when a computer modifies data, the data is incoherent whilst the data is being modified.

its like if you redid the wallpaper of a room, then before you start, the room is coherent, and after you finish also coherent, but during the wallpapering, it is incoherent, people cannot use the room normally. the room is incoherent when it cannot be used freely. with the old wallpaper it is coherent.

now if people only use the room say 9am to 5pm, if you wallpaper it after hours, it is always coherent for the users of the room.

the above problem looks like a coherency problem, where the display rate is out of synch with the data generation,

incoherency more generally is when the different parts of something dont combine into a bigger picture, eg a collage.

the Amiga actually supplied a nice double-buffering system which, if programs were written properly, would always supply coherent graphics regardless of frame rates or data processing rates; and if they kept the data rates sufficiently below the frame rate, it led to implausibly smooth scrolling, which the Atari ST couldn't do as it didn't have the hardware subtleties.

basically you have 2 zones of memory: image1, image2,

the graphics hardware displays image1 whilst image2 is being rendered. Once image2 has been rendered, the Amiga custom chips allowed you to wait till the video beam had completed displaying the current frame, at which point you load the memory address of image2, so that one is now coherently displayed; and whilst it is being displayed you re-render image1.

you could thus guarantee that a scan of the CRT screen was ALWAYS coherent,
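A minimal sketch of that double-buffering idea, in generic Python rather than Amiga copper/blitter code (the buffer size and render routine are placeholders): the display always scans out a finished buffer while the other one is redrawn, so a frame is never shown half-updated.

Code:
front = bytearray(320 * 256)   # buffer currently being displayed
back  = bytearray(320 * 256)   # buffer currently being rendered

def render(buf, frame_no):
    # stand-in for real drawing: fill the whole buffer for this frame
    buf[:] = bytes([frame_no % 256]) * len(buf)

def wait_for_vblank():
    pass   # on real hardware: block until the beam finishes the current frame

for frame_no in range(3):
    render(back, frame_no)      # draw the next frame off-screen
    wait_for_vblank()           # let the current scan complete
    front, back = back, front   # swap: display the finished frame, redraw the other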

it is thus partly a programming problem, that the game the guy is testing hasnt been written for such a high frame rate, causing artefacts.

its like me using the ancient Photoshop version on a 3840 x 2160 screen which it hasnt been written for. but where I managed to reconfigure the OS for.

similarly with that game, he may be able to fix the problem by slowing down the refresh rate!

and he also says:
------------------------------------
Another downside worth mentioning about the 271QRX is the text rendering. Most users who aren't dancing between office tasks that include a lot of text and are exclusively gaming likely won't even notice this issue. However, those who do switch between games and heavy text workflows should know the text rendering on QD-OLED panels isn't very clear.
This issue can be attributed to the sub-pixel layout, which is the specific layout of Red, Green, and Blue (RGB). The specific issue is called text fringing, which is essentially the occurrence of a white glow around the outside of the text. This problem is particularly noticeable when typing white text onto a black background document.
These second-generation QD-OLED panels have made improvements to text rendering compared to the first generation, but the problem is still noticeable and may result in eye strain over long periods of reading/writing.
------------------------------------

I checked with a yardstick, and 31.5" is only a bit bigger than 27", and basically 31.5" at 4K gives significantly smaller pixels.

27" 2K pixels at 31.5" ought to be 2560 x 31.5/27 = 2987 pixel width,
but in fact we get 3840, thus we get an extra 853 pixels in that width, which is 28.6% more pixels than the 27".
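A quick check of that arithmetic:

Code:
scaled_width = 2560 * 31.5 / 27          # width needed to keep the 27in 2K pixel density
extra = 3840 - scaled_width              # what 4K actually gives on top of that
print(round(scaled_width), round(extra), f"{extra / scaled_width:.1%}")
# -> 2987 853 28.6%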

thus I think 31.5" is better, and I will probably wait for the technology to stabilise and prices to drop.

the 27" MSI is expected 24th April, the ASUS ROG 29th May by one seller. so you cant even buy either now, thus it is best to wait some months, not weeks or days!
 

Richard1234

Distinguished
Aug 18, 2016
273
5
18,685
the problem the reviewer mentions of the text is probably a similar problem,

that 1.33333 x (1920 x 1080) is out of synch with HD, whereas 4K=2x(1920 x 1080) is in synch

which is my original objection to 2K and non integer multiples of HD.

basically say you had some consecutive pixel perfect text:

BWBWBW

if you rescale by 1.33333... ie 4/3, the BWB becomes divided into 4 pixels, which I delineate by brackets:

(B × 0.75) (B × 0.25 + W × 0.5) (W × 0.5 + B × 0.25) (B × 0.75)

continuing, the WBW becomes (W × 0.75) (W × 0.25 + B × 0.5) (B × 0.5 + W × 0.25) (W × 0.75)

net effect, the precise BWBWBW becomes

BwwBWbbW

where w means a shade of grey biased towards white (about two-thirds white), and b means a shade of grey biased towards black (about two-thirds black).

and you get irritating text.

but with 4K you'd get BBWWBBWWBBWW
and this would be precise.
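Here is a small numeric sketch of that argument, using simple area-averaging (a box filter); real scalers use fancier kernels, but the grey fringing on the non-integer 4/3 ratio, and the clean result at exactly 2x, show up the same way.

Code:
def rescale(pixels, factor):
    """Resample a 1-D row of 0.0 (black) / 1.0 (white) pixels by 'factor'."""
    n_out = round(len(pixels) * factor)
    src_width = len(pixels) / n_out          # width of one output pixel in input units
    out = []
    for i in range(n_out):
        start, end = i * src_width, (i + 1) * src_width
        total = 0.0
        j = int(start)
        while j < end and j < len(pixels):
            overlap = min(end, j + 1) - max(start, j)
            total += pixels[j] * overlap
            j += 1
        out.append(total / src_width)
    return out

row = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]               # B W B W B W
print([round(v, 2) for v in rescale(row, 4 / 3)])  # grey fringes appear
print([round(v, 2) for v in rescale(row, 2)])      # clean doubling: 0s and 1s only

The 4/3 output comes out as B, ~0.67, ~0.67, B, W, ~0.33, ~0.33, W (the BwwBWbbW pattern above), while the 2x output is pure 0s and 1s (BBWWBBWWBBWW).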

even if software is written more carefully, sometimes it is really tricky to get say pixel size versus image size right, and eg with my HP laptop, really nonstandard 3000 x 2000 res, its basically IMPOSSIBLE to get some things right, because at the low level the ratios are just wrong!

I am not going to ever get a 2K monitor!
 

Aeacus

Titan
Ambassador
thus I think I'd prefer 4K, at first I thought 2K would be 2x, but reading your later comments, I see that 2K is in fact 1.3333x, and 2x is 4K!
720p = 1280x720 (HD)
1080p = 1920x1080 (Full HD)
2K = 2560x1440 (Quad HD)
4K = 3840x2160 (Ultra HD)

2K (1440p) is called Quad HD since it is 4x the pixels of 720p (HD). 2K reso is commonly used in smart phones, PC monitors and console screens.

While for TVs, 2K reso isn't used at all. Moreover, broadcast resolutions for TVs are 720p, 1080p, 4K and 8K. So, with nobody even making content in 2K (1440p), there is 0 point in making a TV that natively displays 2K.

thus I think 31.5" is better, and I will probably wait for the technology to stabilise and prices to drop.
If you don't have urgent need for newer/bigger/better monitor, sure, you can wait.

if the inbuilt stand is good, then maybe VESA isnt necessary. the newer LG has a dreadful stand, C shaped, where the C faces the user
Being "good" is subjective when it comes to monitor stands. The trifecta that all monitor stands should have, would be: height, tilt and swivel adjustment. Seeing all 3x is rare when it comes to monitors. Even rarer is 90 degree rotation, from landscape to portrait mode (e.g the Samsung monitor for my Haswell build does have the 90 degree rotation mode as well). Usually it's one or two adjustment options, but not all three.

So, to actually get all 3x adjustment modes, it is always good to buy a monitor that has VESA mounting holes. Since when you're unhappy with the stand (e.g can't change height, tilt or swivel, when you need), you can just mount the monitor to VESA monitor arm and adjust the monitor as you see fit. Also, many monitor arms free up desk space since they are either connected to the edge of the desk or to the wall.

And then there is the issue with the shape of the stand itself, namely the legs part of it.
E.g my MSI monitor stands on a tripod,
official pics: https://www.msi.com/Monitor/Optix-MAG241CR/Gallery

And due to the free space under the stand, i can keep my smart phone (or other stuff) there.

not sure what you mean by "bloom", in ordinary english, blooming is when all the flowers appear, eg at the moment bluebells are blooming. in german, "Blume", means flower, pronounced bloome,
Blooming refers to light spilling onto darker parts of the display, causing brighter objects to appear to have a “halo” around them. For traditional displays that don't support local dimming, this phenomenon is less likely to occur.
And video explanation as well;
View: https://www.youtube.com/watch?v=xAacPPv54nA


how is pcpartpicker managed?

do they actively scan all models on the market, or do they filter at all?
Like so: https://pcpartpicker.com/about/

PCPP, while not perfect, is good enough. Also, it is the best service provider of this kind, and it offers prices (buying options) for many countries - currently 23 in total.

However, sometimes, the prices and/or stock availability isn't accurate to the retail site PCPP links to. So, you actually have to go to the linked retail page to confirm it by yourself. This is to do with the interval PCPP updates it's prices/availability database.

As of what kind of hardware they do display, i can't tell. I've seen some very obscure hardware listed on PCPP while some other, more common ones, are missing. But users can add custom parts to the PCPP listing, when that item isn't in the database. Of course, it adds that custom part to that list but doesn't add to the database itself. For permanent addition to database, more in-depth process is used, to validate the legitimacy of the item.

what is the add on called? as I may try to install it, eg if its more flexible than the inbuilt Edge feature as that one is lower quality image if the webpage is longer.
Well, mine is ScreenGrab! but i did look it up and currently, it is EOL. Still, the plugin works for me.

For an alternative, FireShot seems to be the same as or even better than what i have,
link: https://addons.mozilla.org/en-US/fi...la.org&utm_medium=referral&utm_content=search

I dont completely agree with this philosophy!
"To each their own", means that every person is free to decide what to do. It doesn't restrict others telling you new info. But in the end of the day, it's you who decides what to do or use.

There are similar quotes to this one, like:
"Not my cup of tea."
"Whatever rocks your boat."
"Live and let live."
 

Richard1234

Distinguished
Aug 18, 2016
273
5
18,685
progress update:

I will be collecting the 2nd identical M.2 drive and also the freestanding optical drive later today, ie ready for collection. I try to get everything via collection for 2 reasons:
1. I dont have to wait around all day for the courier
2. it is often cheaper! eg I bought a version of Norton 360 which has anti-track included, where it was some £20 for 5 devices, and £4.99 CHEAPER by collection only! as amazon don't have to send their van to this address just for one cardboard envelope!

Now I am thinking of getting a PCIe 5.0 x 4 M.2 2280 drive to experiment with, as you said it doesnt eat from the GPU bandwidth.

I realise there is a bottleneck dilemma, that you need 2 sockets with that speed to properly utilise it,

but potentially the PCI_E2 with PCIe 5.0 x 8
and PCI_E3 with PCIe 5.0 x 4 (at the cost of the USB 3.2 Gen 2x2 at the back, socket 2 on p23 of the mobo manual.)

could supply a 2nd and maybe a 3rd M.2 drive of PCIe 5.0 x 4.

I think PCI_E1 must be reserved for the graphics card, which I will try to install after I install todays 2nd M.2 drive.

Now I am a bit confused about the M.2 Xpander for the PCI_E2 slot: the mobo manual's specification from p16 onwards doesn't mention the expander, at least I cannot locate it, and the contents page only says how to install it, and doesn't even say how many M.2 sockets it has. But it does use the plural on p40, saying "you need to install M.2 SSDs to the M.2 Xpander .... And then install the M.2 Xpander.....", which is ambiguous, but could be interpreted as meaning it has more than 1 socket.

as PCI_E2 is PCIe 5.0 x 8, does that mean it could potentially take a PCIe 5.0 x 8 M.2 drive? or is it as two PCIe 5.0 x 4 sockets?

the PCI_E slots are all from the CPU, so I presume all can go at full speed simultaneously?

for the moment for M2_1, could you give some recommendations disregarding price, for the best PCIe 5.0 x 4 ones at say 1T, 2T, 4T, 8T, .... ?

where probably I will go for a 4T or an 8T. I want to see also what capacity you can get where the price/terabyte either drops or only rises by a small amount. I think you said the really large capacities are much more expensive per terabyte. I think those higher prices are to recoup the R&D costs, so eventually the prices will drop!

now if its a new technology, I might wait for prices to drop, as its not urgent.

but if you recommend some, even astronomically priced ones, I can monitor them till maybe the prices drop.

on a different matter, I went to install Windows 11, which needs a bluray or a dual-layer DVD as it won't fit on a single-layer DVD, and my bluray writing software doesn't work on 64-bit Windows. The iso was on a 2T Samsung drive, but that drive doesn't function on XP, where the bluray writer works. I have to write via the USB bluray drive, as the SATA one is now on the new PC, and I don't want to disconnect and reconnect it without good reason, although the new USB3 external writer arrives today.

so I couldn't write from the 2T Samsung. I tried copying the Windows 11 iso from the new PC to a flash drive with plenty of space, but the flash drive doesn't allow the iso to copy as it's too big, more than 6G (presumably because the flash drive is formatted FAT32, which caps individual files at 4 GB).

eventually I got round the problem, by copying the file from the 2010 PC with 32 bit Windows 10, to a "scratch" partition on the system drive of that. then rebooting to XP, and now have that bluray written, not yet tried installng it. by "scratch" I mean a temporary workspace, eg to decompress an archive to, eg for an installation, then delete it all later, or delete everything when full.

is there any bluray writing software for Windows 10 + 11 64 bit that you recommend, even more importantly, software which will write 3 layer and quad layer blurays?

for 32 bit I was using Roxio 10, but Roxio now has been taken over by Corel, and the versions are below 10, and its not as good as it was!


eg I bought a batch of 10 bluray rewritable (RE) 3 layer (TL) from Japan, averaging £8.25 each, where these are some 100 Gigabytes.

actually ordinary 25 Gig blurays, I can get for 80p each including delivery, which would be £3.20 for 100 Gigabytes which is much better value for money, but having 100 Gig on one disk could be handy.

if a system is designed properly, the same software ought to handle future increases in capacity, where the increase is just a variable.


720p = 1280x720 (HD)
1080p = 1920x1080 (Full HD)
2K = 2560x1440 (Quad HD)
4K = 3840x2160 (Ultra HD)
ok, I have made a note, its a bit confusing! I always thought 1080p was "HD"!

I prefer linear dimensions, where with the above 720p to 1080p is 1.5x=3/2 , 1080p to 2K is 1.33333x = 4/3, 1080p to 4K is 2x.

for me 1080p is the frame of reference, so 720p = 2/3, 2K=4/3, 4K=2x

I think because 1080p was so much better than 720p, and was the first res that was good, I tend to think relative to it, which is anachronistic, but as 720p is basically obsolete, its a practical perspective.

I think you want integer x 1080p, and in fact power-of-2 x 1080p,

(linearly), ie 1080p, 2x 1080p = 4K, 4x 1080p, 8x 1080p etc,

because then you can always scale up pixel perfect, scaling down would need averaging, and lead to problems, but less problems than other ratios.

whereas if you went for 3x 1080p, you run into problems scaling that up to 4x, as the factor is now 4/3.

with laptops, these often aren't full 16:9 height, so you want at least the pixel width to be a power-of-two multiple of the 1080p width (1920).

I know someone with a Windows 7 laptop, and webpages can be problematic as sometimes buttons cannot be reached, where they have a pop up window with a button you cannot scroll to.

with 1080p, dont they sometimes also refer to 1080i?

what is the idea?



2K (1440p) is called Quad HD since it is 4x the pixels of 720p (HD). 2K reso is commonly used in smart phones, PC monitors and console screens.
smartphones are so variable, that things are programmed differently, and also its less satisfactory than desktops, because of the lack of standardisation!

once you standardise the resolution, you can optimise a work for that specific resolution.

fonts are particularly tricky, as a font can be, say, just 7 pixels wide. eg some such fonts appear during my 2023 laptop's early boot, and they are so tiny I need a lens to read them!

when I reinstall the Amiga emulator to a new PC, I have to experiment a lot to find an Amiga font that works well for me. its a very fiddly problem.

I designed my own font pixel by pixel for when I programmed the 32-bit x86 hardware directly. The font is fixed width, where every character fits in an identical space. I forget how many pixels by how many; I worked to get the minimum dimensions where all characters would look ok. The trickier ones are eg m, w, M, W, Q, &, %, ...
The m character means you need at least 5 pixels of width plus spacing; E means you need at least 5 pixels of height plus spacing. If you want y and g to extend below the line, that needs further pixels of height. I designed glyphs for all printable ascii characters, pixel by pixel.
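Purely as an illustration (this is a made-up glyph, not the author's actual font data), here is how such a fixed-width bitmap character can be stored as one bitmask per row and printed:

Code:
# a hypothetical 5x5 bitmap for a lowercase 'm'; each row is a 5-bit mask
M_GLYPH = [
    0b11010,
    0b10101,
    0b10101,
    0b10101,
    0b10101,
]

for row in M_GLYPH:
    # print the most significant bit first, '#' for a lit pixel
    print("".join("#" if row & (1 << (4 - col)) else "." for col in range(5)))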

rescalable fonts are dubious when the pixel size is very small, really you need to design each low res font by hand. with low res rescalables, its best to only rescale by integer factors, otherwise can be unsatisfactory. when I say low res, I really mean small pixel width + height.

but change the resolution, and you get chaos, the font now is too small and if you scale it up by 2x, it could be too big.

today the resolutions are so high, that scaleable fonts may be ok.

I think the super high frame rate is a dubious idea, because if it is faster than the image generation rate, you can get visual incoherency problems.

I think with that monitor review, in fact there is nothing wrong with the monitor nor with the software, but the software hasnt been written for such a high frame rate. so a slower frame rate may well be better.

its a form of "race condition" where one item is too fast for the other, and sneaks in ahead of it, where you have to hold things back otherwise you get problems. eg the double booking problem, where an aeroplane has 1 seat left. 2 travel agents check the central database, and find 1 seat free, which they both sell to their customer. so 2 tickets are sold for 1 seat! the central database has to only allow one travel agent to access an unbooked seat at a time. CPU hardware eg x86 has special instructions to enable one to guarantee only 1 core has access to something. its not enough for a CPU to have + - * /, etc, it also needs some special instructions for this kind of thing.


there is no point having a frame rate way beyond what you can perceive,

the webpages on QLED talk of the heat generation, I think you are just generating heat unnecessarily!

faster frame rate, probably does need faster electronics, which then is probably more heat.

better to just have a slower frame rate, yet still fast, and reallocate the bandwidth to other things, or a cooler machine.

honestly this LG's frame rate is just fine, although I havent looked at much video with it, where I think it is 60Hz:

https://www.lg.com/uk/monitors/uhd-4k-5k/27ul500-w/

colour blind people will have much faster response time of their retina, so you might find the gamers who can handle vastly faster frame rates are colour blind.

with soccer, they tend to use left footed people for strikers on the left of the field.

monochrome tv and film is much higher resolution, easily 3x linear resolution. where for really high quality colour, it may be better to send the image to 3 monochrome sensors via filters.

really high frame rates are like having a car which can do 700mph, but which can only be driven at 30mph because the road curves around too much. where you could only drive at 700mph on a specially designed really low curvature road!

basically the higher the speed limit, the gentler the curvature of the road,
eg with 70mph motorways, the road has to curve really gradually; where they are forced to curve the road more, they will put say a 60mph or 50mph or 40mph or 30mph speed limit,

otherwise you will fly off a tangent to the road!

with motorways, you can generally shut your eyes for a few seconds without problem!

or going shopping in a lorry, when the big problem is you wont be able to fill the lorry, and problematic to park it!

bigger isnt necessarily better, often optimality is a balancing act, and where its better for things to be sufficiently big not maximally big.

I went on a ferry from Harwich to Rotterdam long ago, and they had an entire shopping mall and cinema on the ferry! the problem is for such a journey, which was overnight, I wasnt in the mood to watch a film!

the return ferry was totally different, no cinema!


While for TVs, 2K reso isn't used at all. Moreover, broadcast resolutions for TVs are 720p, 1080p, 4K and 8K. So, with nobody even making content in 2K (1440p), there is 0 point in making a TV that natively displays 2K.

If you don't have urgent need for newer/bigger/better monitor, sure, you can wait.
as the LG with DP is good, I will wait. with HDMI I wouldnt wait, as it is dreadful.


Being "good" is subjective when it comes to monitor stands. The trifecta that all monitor stands should have, would be: height, tilt and swivel adjustment.

with this LG, I would add that you dont want the stand to obstruct the table.
with this LG, the C shape is the problem, they should have used a [ shape,

Seeing all 3x is rare when it comes to monitors. Even rarer is 90 degree rotation, from landscape to portrait mode (e.g the Samsung monitor for my Haswell build does have the 90 degree rotation mode as well). Usually it's one or two adjustment options, but not all three.

90 degree would need the centre point higher, which probably needs a height adjustment as well. where you shift it higher, then rotate 90.

but with bigger screens, 90 degree is less important,

main thing people want is for an A4 page to fit on the monitor in portrait orientation, where you want the monitor screen height to be at least 29.7cm.

90 degree could be important for eg railway timetable displays,

this LG just has tilt control, where you can tilt it to be perpendicular to your line of sight to the centre of the screen.

So, to actually get all 3x adjustment modes, it is always good to buy a monitor that has VESA mounting holes. Since when you're unhappy with the stand (e.g can't change height, tilt or swivel, when you need), you can just mount the monitor to VESA monitor arm and adjust the monitor as you see fit. Also, many monitor arms free up desk space since they are either connected to the edge of the desk or to the wall.
if it's at the edge of the desk, that frees up all the desktop space, BUT you need a desk of bespoke depth, eg some MDF cut to a carefully chosen width! unless the arm can extend variable amounts over the table, in which case you could use the desktop space under the monitor.


And then there is the issue with the shape of the stand itself, namely the legs part of it.
E.g my MSI monitor stands on a tripod,
official pics: https://www.msi.com/Monitor/Optix-MAG241CR/Gallery

And due to the free space under the stand, i can keep my smart phone (or other stuff) there.

looking at that, it appears the stand doesnt extend beyond the front of the monitor plane?

that would be good design, my 2 LG's stands both extend beyond the plane of the monitor.

And video explanation as well;
View: https://www.youtube.com/watch?v=xAacPPv54nA



Like so: https://pcpartpicker.com/about/

PCPP, while not perfect, is good enough. Also, it is the best service provider of this kind, and it offers prices (buying options) for many countries - currently 23 in total.

However, sometimes, the prices and/or stock availability isn't accurate to the retail site PCPP links to. So, you actually have to go to the linked retail page to confirm it by yourself. This is to do with the interval PCPP updates it's prices/availability database.

As of what kind of hardware they do display, i can't tell. I've seen some very obscure hardware listed on PCPP while some other, more common ones, are missing. But users can add custom parts to the PCPP listing, when that item isn't in the database. Of course, it adds that custom part to that list but doesn't add to the database itself. For permanent addition to database, more in-depth process is used, to validate the legitimacy of the item.


Well, mine is ScreenGrab! but i did look it up and currently, it is EOL. Still, the plugin works for me.
EOL = "end of the line"?

I looked for "Screengrab!" on the Firefox "add ons and themes" area, and it gives a ton of ones, so I might research some!

For an alternative, FireShot seems to be the same as or even better than what i have,
link: https://addons.mozilla.org/en-US/fi...la.org&utm_medium=referral&utm_content=search

"To each their own", means that every person is free to decide what to do. It doesn't restrict others telling you new info. But in the end of the day, it's you who decides what to do or use.

There are similar quotes to this one, like:
"Not my cup of tea."
"Whatever rocks your boat."
"Live and let live."
yes, I think one should say if one thinks there is a better way, but to leave the person to then make an informed decision in their own time.

often when you are told something, it takes some time to understand why the idea is good, on the surface it may seem bad.

a lot of stuff we do is because often we have to do something, and we just arent aware of other options. eg I wanted a monitor, so I went to PC World, asked the salesman and he guided me to this monitor!

I didnt think of asking online. Now with the PC, I did ask online because the shop didnt even sell tower cases, and I couldnt find anything satisfactory eg on ebay, so I had no option but to ask online.

 

Aeacus

Titan
Ambassador
as PCI_E2 is PCIe 5.0 x 8, does that mean it could potentially take a PCIe 5.0 x 8 M.2 drive? or is it as two PCIe 5.0 x 4 sockets?
There is no M.2 drive in the world that utilizes 8x lanes (or 16x lanes). All that we currently have mainly utilize 4x lanes. There are some 2x and 1x lane M.2 devices as well (e.g wi-fi modules).

the PCI_E slots are all from the CPU, so I presume all can go at full speed simultaneously?
Yes.

for the moment for M2_1, could you give some recommendations disregarding price, for the best PCIe 5.0 x 4 ones at say 1T, 2T, 4T, 8T, .... ?
"Best" is vague term. Since with M.2 drives, they have several different metrics and none of the drives have all metrics at highest, whereby the drive being "the best". Those metrics are: read speeds, write speeds, durability, reliability, heat generation, power consumption, capacity, price (namely price to performance ratio aka value).

The fastest read/write speed PCI-E 5.0 x4 drive currently is Crucial T705 (up to 14500 MB/s read and 12700 MB/s write speeds),
review: https://www.tomshardware.com/pc-components/ssds/crucial-t705-2tb-ssd-review
pcpp: https://uk.pcpartpicker.com/product...e-50-x4-nvme-solid-state-drive-ct4000t705ssd5

At current moment, PCI-E 5.0 x4 drives are in capacity of: 1TB, 2TB and 4TB. There are no 8TB drives for PCI-E 5.0.
If you want 8TB drive, you have to look at PCI-E 4.0 x4 drive.

Best (fastest read/write) 8TB PCI-E 4.0 x4 drive is Corsair MP600 Pro XT (up to 7000 MB/s read and 6100 MB/s write speeds),
review: https://www.tomshardware.com/reviews/corsair-mp600-pro-xt-ssd-review-corsairs-best-just-leveled-up
pcpp: https://uk.pcpartpicker.com/product...4-nvme-solid-state-drive-cssd-f8000gbmp600pxt
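For context on why those read speeds sit where they do, here is a back-of-envelope calculation of raw PCI-E bandwidth per lane (128b/130b encoding only; real-world protocol overhead shaves off a bit more):

Code:
GT_PER_LANE = {"3.0": 8, "4.0": 16, "5.0": 32}     # gigatransfers per second

for gen, gt in GT_PER_LANE.items():
    per_lane = gt * 128 / 130 / 8                  # GB/s after encoding overhead
    print(f"PCI-E {gen}: {per_lane:.2f} GB/s per lane, x4 = {per_lane * 4:.1f} GB/s")
# a 5.0 x4 slot tops out near ~15.8 GB/s (hence ~14.5 GB/s drives like the T705),
# while a 4.0 x4 slot tops out near ~7.9 GB/s (hence ~7 GB/s drives like the MP600 Pro XT)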

I want to see also what capacity you can get where the price/terabyte either drops or only rises by a small amount. I think you said the really large capacities are much more expensive per terabyte. I think those higher prices are to recoup the R&D costs, so eventually the prices will drop!
If you go by the lowest price per GB, you have to give up on other aspects. Like read/write speeds and/or durability/reliability. Maybe even on heat/power aspect.

Value wise, value king would be Silicon Power UD90 PCI-E 4.0 x4 drives, both the 2TB and 4TB variants, whereby price per GB is £0.05.
review: https://www.tomshardware.com/reviews/silicon-power-ud90-ssd-review
pcpp: https://uk.pcpartpicker.com/products/compare/f4cG3C,2BvD4D/

In comparison, Corsair MP600 Pro XT 8TB drive has price per GB at £0.12.
And Crucial T705 4TB drive has price per GB at £0.154.
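The price-per-GB comparison is just price divided by capacity; a tiny helper makes it easy to redo as prices move (the prices below are back-derived from the per-GB figures quoted above, not live quotes):

Code:
def price_per_gb(price_gbp, capacity_tb):
    return price_gbp / (capacity_tb * 1000)

examples = {                                   # illustrative prices only
    "Silicon Power UD90 4TB":     (200, 4),    # ~ £0.05/GB
    "Corsair MP600 Pro XT 8TB":   (960, 8),    # ~ £0.12/GB
    "Crucial T705 4TB":           (616, 4),    # ~ £0.154/GB
}
for name, (price, tb) in examples.items():
    print(f"{name}: £{price_per_gb(price, tb):.3f}/GB")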

is there any bluray writing software for Windows 10 + 11 64 bit that you recommend, even more importantly, software which will write 3 layer and quad layer blurays?
Blu-ray discs are essentially obsolete, replaced by USB thumb drives.

4x layer Blu-ray disc can hold up to 128 GB. At current date, that's peanuts. Especially since one can buy cheap USB thumb drive that holds 256 GB or 512 GB. Biggest capacity thumb drives currently are up to 2 TB in size.

With a Blu-ray disc, you need dedicated hardware (an ODD) to read and write it, plus the blank disc as well to write data on. Moreover, data written to a write-once disc (BD-R) is read-only and can not be changed, though you can add more to it while it has free space left. Still, burning the disc takes quite a bit of time and IF the burn should fail, the data written will be corrupt, with no chance to rectify it. Not a reliable concept if you ask me. Moreover, the Blu-ray disc itself is quite big (dimensions wise) and usually needs to be kept in a secure container, so that the disc doesn't get damaged.

USB thumb drives, on the other hand, are like standard storage drives, but in FAR smaller dimensions. Also, any PC that has a USB port (usually type-A) can access and use a USB thumb drive without issues. Best part: if the data written to the USB drive should become corrupt, you can erase it and write again, or even format the entire drive and start anew. A far more reliable concept if you ask me, especially when compared to CD/DVD/Blu-ray discs.

Current best USB thumb drives,
article: https://www.tomshardware.com/best-picks/best-flash-drives
E.g Buffalo SSD-PUT 2TB, amazon UK: https://www.amazon.co.uk/dp/B09XS1972J/

But to answer your question regarding burning software;
I've always used Nero Burning ROM to burn my CD/DVD discs. Now, i haven't burned anything in years (despite having many blank discs and an ODD as well) since they are obsolete. I have a 128 GB USB thumb drive for that. The only minor use would be to burn an audio CD that i can listen to in my car, but i haven't bothered. I already have several good audio album CDs (legit ones) that i listen to.
Nero Burning ROM, wiki: https://en.wikipedia.org/wiki/Nero_Burning_ROM

on a different matter, I went to install Windows 11, which needs a bluray or a dual layer DVD as it wont fit on a DVD, and my bluray writing software doesnt work on 64 bit Windows. the iso was on a 2T Samsung drive, but that doesnt function on XP where the bluray writer works. I have to write to the USB bluray drive, as the SATA one is now on the new PC, and I dont want to disconnect and reconnect without good reason, although the new USB3 external writer arrives today.
Why burn the *.iso to DVD? Why not use USB thumb drive instead?

In-depth guide on how to install Win11 here: https://forums.tomshardware.com/threads/windows-11-clean-install-tutorial.3831442/

I prefer linear dimensions, where with the above 720p to 1080p is 1.5x=3/2 , 1080p to 2K is 1.33333x = 4/3, 1080p to 4K is 2x.
Monitor/TV resolutions can't be linear by the power of two. Instead, they follow the aspect ratio of 16:9 (1.78:1).
Further reading: https://en.wikipedia.org/wiki/16:9_aspect_ratio

Before the widescreen (16:9), the square-ish aspect ratio was common, namely 4:3 (e.g CRT monitors).

[Image: chart comparing the 4:3 and 16:9 aspect ratios]
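A quick check that the common resolutions listed above all reduce to 16:9:

Code:
from math import gcd

for w, h in [(1280, 720), (1920, 1080), (2560, 1440), (3840, 2160)]:
    g = gcd(w, h)
    print(f"{w}x{h} -> {w // g}:{h // g}")    # each one prints 16:9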


with 1080p, dont they sometimes also refer to 1080i?

what is the idea?
There are differences between 1080p and 1080i, namely:
1080i (also known as BT.709) is a combination of frame resolution and scan type. 1080i is used in high-definition television (HDTV) and high-definition video. The number "1080" refers to the number of horizontal lines on the screen. The "i" is an abbreviation for "interlaced"; this indicates that only the even lines of each frame, then only the odd lines, are drawn alternately, so that only half the number of lines are ever updated at once. A related display resolution is 1080p, which also has 1080 lines of resolution; the "p" refers to progressive scan, which indicates that each full frame appears on the screen in sequence.
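A tiny illustration of the difference (six stand-in lines rather than 1080): progressive sends every line of a frame in order, while interlaced splits the frame into an even field and an odd field and sends them alternately.

Code:
frame = [f"line {i}" for i in range(6)]    # stand-in for the 1080 lines of a frame

even_field = frame[0::2]    # field 1: lines 0, 2, 4, ...
odd_field  = frame[1::2]    # field 2: lines 1, 3, 5, ...

print("progressive:", frame)
print("interlaced :", even_field, "then", odd_field)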

where you could only drive at 700mph on a specially designed really low curvature road!
A road doesn't need to have low curvature for one to drive fast. Instead, the curve can be quite tight; what keeps the vehicle from sliding outwards is raising the road's bank angle, up to 90 degrees, and that has been built in several places. Namely, vehicle test tracks have 90-degree high-banked curves.

E.g one in Germany, Mercedes-Benz test track:

einfahrbahn-3-w1920xh1080-cutout.jpg


Source + further reading about that test track: https://group.mercedes-benz.com/com...bahn-untertuerkheimspecial-topic-article.html

Even some high-speed trains tilt themselves when they go around the curve, so that they can maintain the higher speed.
Further reading: https://en.wikipedia.org/wiki/Tilting_train

main thing people want is for an A4 page to fit on the monitor in portrait orientation
Not only that, but a 90-degree-rotated monitor is far better for viewing webpages (less scrolling) and also when using certain office programs. A 90-degree monitor is especially helpful for programmers/game developers.

Idea is this:

bqp-programmer-duo-monitor-gw2790qt-vertical


looking at that, it appears the stand doesn't extend beyond the front of the monitor plane?
Well, since i have a curved monitor, the monitor plane isn't straight when viewed from the side. And the feet do protrude a bit further out from the monitor plane (~5 cm or so). But it won't get in the way of my usage.

EOL = "end of the line"?
Almost. EOL = End Of Life. Aka discontinued.
 

Richard1234

Distinguished
Aug 18, 2016
273
5
18,685
There is no M.2 drive in the world that utilizes x8 lanes (or x16 lanes). All that we currently have mainly utilize x4 lanes. There are some x2 and x1 lane ones as well (e.g wi-fi modules).

original question: the PCI_E slots are all from the CPU, so I presume all can go at full speed simultaneously?

ok, I have noted this for future reference,


"Best" is vague term.

I left it vague, as I don't know what best is!

Since with M.2 drives, they have several different metrics and none of the drives have all metrics at highest, whereby the drive being "the best". Those metrics are: read speeds,

this is a fundamental criterion, as everything is written to be read,

write speeds,

this is important for backups, where one probably won't ever use most of them, but they are there as an insurance policy; here the main activity is writing.

durability, reliability,

are you saying some on the market aren't durable?
and some aren't reliable?

eg there is a pub near Bristol which says "good food and real ales served", that suggests some places serve bad food and phoney ales!

when you say reliability, are you talking about where sometimes the drive gives an error, and sometimes not?

where durability presumably is about where it goes kaput, with permanent error from some point in time onwards?


heat generation,

looking at your later URLs, if it just requires passive cooling, then less important,

but if it requires fans, then we are back to the noise problem.

power consumption,

power is less important per se, but it is indirectly important if it leads to too much heat, and then the noise of fans,
also indirectly for mobile devices, as the batteries will run out sooner, which then means bigger batteries, which are an encumbrance.


capacity,

this is a complicated one, I'd go for a lower capacity to get a cheaper price,

and as you point out, some technologies have a capacity limit.

ultimately, for me, 2T is already very big, so I'd consider 4T more for say backing up two 2T's.

approx 1997, iomega did an interesting hard drive, with removable cartridges.

the thing is M.2 sockets are limited in number, so the drives are semi-permanent; it's a performance to change them, like with SATA drives, so a higher capacity can mitigate the hassle of changing them.

price (namely price to performance ratio aka value).

price I will disregard, as it isn't about the physical product but the procurement of it!


instead I will maximise the innate qualities of the product, but then opt for a lower capacity if the price is too high.

I have been around long enough to know that they always skew pricing where the top end is very expensive, and the low end is very affordable.

so when the technology progresses, the earlier expensive stuff becomes affordable,

as technologies establish, you always get economies of scale, where it's always more efficient to do a lot of something than a few of something.

also the steps gradually get optimised,


The fastest read/write speed PCI-E 5.0 x4 drive currently is Crucial T705 (up to 14500 MB/s read and 12700 MB/s write speeds),
review: https://www.tomshardware.com/pc-components/ssds/crucial-t705-2tb-ssd-review
pcpp: https://uk.pcpartpicker.com/product...e-50-x4-nvme-solid-state-drive-ct4000t705ssd5
I read the full review, and it sounds very interesting, and it is passively cooled,

I won't make a decision right away, but I may go for the 4T, as 4T seems the proportionate capacity to go for.

at £615, I won't make a hasty decision, but will think it over as I work on the other problems.


At current moment, PCI-E 5.0 x4 drives are in capacity of: 1TB, 2TB and 4TB. There are no 8TB drives for PCI-E 5.0.
If you want 8TB drive, you have to look at PCI-E 4.0 x4 drive.

Best (fastest read/write) 8TB PCI-E 4.0 x4 drive is Corsair MP600 Pro XT (up to 7000 MB/s read and 6100 MB/s write speeds),
review: https://www.tomshardware.com/reviews/corsair-mp600-pro-xt-ssd-review-corsairs-best-just-leveled-up
pcpp: https://uk.pcpartpicker.com/product...4-nvme-solid-state-drive-cssd-f8000gbmp600pxt
if I got an 8TB PCIe 4.0 x4 drive, I would probably not put that in M2_1, as that socket can take PCIe 5.0 x4,

I want to try and use sockets efficiently as regards data rates, where ideally the socket and the device are the same rate.

now if I ran out of sockets, I'd start using the reserved higher grade sockets for lower grade hardware!

now there does remain a further question:

the M.2 expander converts the PCI_E2 PCIe 5.0 x 8 into M.2 socket(s),
and you said there are no 8 lane M.2 drives,

does it convert PCI_E2 into two PCIe 5.0 x 4 M.2 sockets?

as that would enable up to three PCIe 5.0 x 4 M.2 drives for the mobo.

are those sockets then called M2_5 and M2_6?

If you go by the lowest price per GB, you have to give up on other aspects. Like read/write speeds and/or durability/reliability. Maybe even on heat/power aspect.

I wouldn't just go for the outright lowest price per GB, but I would allow some blur, same with read and write speeds,

eg if one item is 7 GB/s and the other is 7.2 GB/s, I wouldn't lose a lot of sleep if I went for the 7 GB/s one, and would go for it if it was significantly cheaper.

6 or 7 or 8, it's all much of a muchness, but 7 vs 14 is significant.


Value wise, value king would be Silicon Power UD90 PCI-E 4.0 x4 drives, both the 2TB and 4TB variants, whereby price per GB is £0.05.
review: https://www.tomshardware.com/reviews/silicon-power-ud90-ssd-review
pcpp: https://uk.pcpartpicker.com/products/compare/f4cG3C,2BvD4D/

In comparison, Corsair MP600 Pro XT 8TB drive has price per GB at £0.12.
And Crucial T705 4TB drive has price per GB at £0.154.
what are the negatives for the £0.05 ones?


Blu-ray discs are essentially obsolete, replaced by USB thumb drives.

maybe, but eg I have a dual-recording 3D bluray TV box,

and the only way to "own" the recordings is via blurays, where I can put a lot of recordings onto one bluray. I literally have 100s of blurays recorded.

basically you can record 2 TV programs at the same time to the hard disk,

then later you can shunt the recordings to blurays. I use BD-Rs, because these are 80 pence each including postage. 80p for 25Gig.

I now privately own the recording. commercially I don't own it, but I do for my own private use.

I can also record to a USB hard drive, but it uses a proprietary formatting of the drive, where I can only play those recordings from that specific drive and specific recorder. I cannot copy those to another drive, so I don't own the recording, it is tied to the machine.

some recordings only allow I think one copy (or it could be 2), where with each copy they have a counter, so you can't copy the copy. this is usually where the program is a proper production. but I can play that copy on any machine.

with many TV boxes, recordings are to the internal hard drive only, and you can't get them out of the box, so you don't own them. eventually the drive is full, and you have to keep deleting to free up space.

this is a problem with the modern trend of streaming films: you no longer own the film, but only have a licence to view it. eg I know someone with an Apple box, which has virtually every film ever. he demonstrated it with the film The Stepford Wives, the remake with Nicole Kidman, not the original version from the 1970s. there were 2 options: a lower price to view just once, and a higher price to be able to view repeatedly. I don't like this form of extortion.

DVDs now cost about £1, mostly refurbished, and so you can own films dirt cheap. DVDs take a lot of space, but the thing to do is discard the case and just keep the inlay card, then they take much less space.

his Apple box doesn't have everything, it doesn't have the "Meet the Applegates" film!


4x layer Blu-ray disc can hold up to 128 GB. At current date, that's peanuts.

it does depend, eg it's more storage than my PC's memory!


nonetheless it is useful, say, if you wanted to save every dashcam recording of every car journey you ever do!

now it all depends on the price per GB, and how that compares to your flash drives,

eg with BDRs, the price is 80 pence for 25GB, that is 3.2 pence per GB, which is cheaper than your earlier
Silicon Power UD90
which is 5p per GB.

and you can have an unlimited amount of BDRs, whereas the M.2 drive is a hassle to change. now that M.2 drive is equivalent to 160 BDRs, so spacewise it is much better, and 3.2 pence versus 5 pence is much of a muchness,

but the big problem in this case is the hassle of going beyond the 160 BDRs. I literally have hundreds of BDRs, where each might have 15 TV programs.

Especially since one can buy cheap USB thumb drive that holds 256 GB or 512 GB. Biggest capacity thumb drives currently are up to 2 TB in size.
the million dollar question is what price per GB for that 2TB drive?

I bought a tiny 64gig USB3 C thumbdrive in a supermarket today, for maybe installing Windows 11 from, but it cost me some £14, that is 21.8 pence per GB; a bluray is 3.2 pence, so the bluray is 6.8x cheaper! I agree that the thumbdrive is hands down better in terms of size. but 21.8 pence versus 3.2 pence is significant. 5 pence versus 3.2 pence isn't significant for me.
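
since price per GB keeps coming up, here is a tiny Python sketch of the arithmetic being used in this thread (the prices are just the example figures quoted above, so treat them as assumptions):

```python
# Price-per-GB comparison, using the example figures quoted in this thread.

def pence_per_gb(price_pence: float, capacity_gb: float) -> float:
    """Return cost in pence per gigabyte."""
    return price_pence / capacity_gb

media = {
    "BD-R (80p, 25 GB)":           pence_per_gb(80, 25),     # ~3.2 p/GB
    "64 GB USB-C stick (£14)":     pence_per_gb(1400, 64),   # ~21.9 p/GB
    "Silicon Power UD90 (quoted)": 5.0,                      # figure quoted earlier
}

for name, cost in sorted(media.items(), key=lambda kv: kv[1]):
    print(f"{name}: {cost:.1f} pence per GB")
```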

With Blu-ray discs, you need dedicated hardware to read and write them (an ODD), plus the blank disc as well,

um, you might not have noticed, but you need a hulking PC and monitor to write to thumbdrives also! you need a PC with a mobo and a USB controller, and a chipset, a monitor, etc., whereas with optical drives, you can just have a hifi system, where the technology is better encapsulated into the drive.
where you don't even need a monitor.


what I find interesting about blurays is there are no electronics in the storage! it's just some perspex with a strange surface, no transistors or resistors, etc.


to write data onto. Moreover, any data written to a Blu-ray disc is read-only and cannot be changed.

false!

even with CDs, you had CD-RWs, where you can read and write and delete, and re-record, etc.

with DVDs also, you had DVD-RW,

with BDs, you have BD-RE, where RE stands for something like recordable-erasable, ie re-recordable,

with this new drive, it has re-recordable three-layer, TL RE; it looks like for quad layer it is write-once.

with BDs, I eventually just used BD-Rs, because these are faster and cheaper than the re-recordable ones.

I bought a box of 10 Sony TL REs for £82.52, direct from Japan, not arrived yet, which is £8.25 per 100 Gigabytes of re-recordable data, which is 8.25 pence per GB. just to experiment with the technology, to see what it's about.

I have actually programmed writing to CDRW on the Amiga via SCSI 3 to a Yamaha CDRW drive, approx 1998.

the optical drives have a bit of inertia before you can access them.

but that is probably because they only spin on demand, whereas hard drives spin continually even if not accessed.

But you can add more to it while it has free space left. Still, burning the disc takes quite a bit of time and IF the burn should fail, the written data will be corrupt, with no chance to rectify it.

for the CD-R, DVD-R and BD-R, I agree,

optical drives have some inertia, and can be noisy when recording, as they literally have to burn the disc.

in practice, I have had virtually no fails, so I think it's a scare tactic to emphasise that problem.

the main time I ran into jeopardy was when a recording of 2 TV programs began during a burn, as the machine cannot record to bluray at the same time.

so you need to check the recording schedule, and only record to bluray when no TV recordings are imminent.

normally, I just select say up to 19 programs from the machine's hard disk recordings, press various confirm buttons, and then it burns away, making moderate noise, and many minutes later it is complete. I then have to finalise the disc, although unfinalised, it works on players by the same manufacturer.

this is with BD-R.

Not a reliable concept if you ask me. Moreover, the Blu-ray disc itself is quite big (dimensions-wise) and usually needs to be kept in a protective case, so that the disc doesn't get damaged.
yes, the size is a minus. in terms of damage, the perspex is pretty strong, though you should of course store it carefully. we had a CD sampler of Baroque music, which we played a lot because it was so good.
eventually the disc became unplayable. I then tried something: I washed it with dishwashing detergent and a lint-free cloth, and then it played perfectly again! it had just become dirty. I did an iso copy just in case.

if you try to damage a CD or bluray, in order to discard it, you will struggle a bit!

vinyl disks were much more susceptible to damage.

I always avoid touching the recording surface of any recorded medium, whether vinyl, cassette, or optical.

with vinyl disks, the recording starts at the outer and spirals to the middle.

with optical disks, it's the other way round from vinyl: it starts at the centre and spirals outwards! the idea is the edge is the most vulnerable to mishandling and damage, so the data starts at the middle.

I learnt this from when I was programming the CDRW, where I contacted a guy online who had written a book on SCSI, who then forwarded me to a guy on the standards committee for optical media. he then told me some crazy stuff about how optical media work.

USB thumb drives, on the other hand, are like standard storage drives, but in FAR smaller dimensions. Also, any PC that has a USB port (usually Type-A) can access and use a USB thumb drive without issues.

the size is definitely an advantage, but I am unconvinced of the cost advantage versus write-once optical media, eg BDR. even approx 1998, a CDRW of some 650MB was vastly cheaper than a hard drive of that capacity.

it's because your thumbdrive probably needs a memory cell for every binary digit, basically it needs an electronic circuit for every digit, whereas an optical disk is just dumb material, it's just some perspex with a coating.

now this does mean you need a machine to read and write the coating,


Best part: if the data written to the USB drive should become corrupt, you can erase it and write again. Or even format the entire drive and start anew. A far more reliable concept if you ask me.
not an advantage, as optical media can also do this,

you need to get the right kind of optical drive, eg CDRW, DVD+RW, DVD-RW, BDRE, TL RE; look for the letters RW for the earlier era, and RE for the later era.

these disks are doubly impressive: firstly that they are optical technology, which is underexplored, and secondly that they are a dumb physical medium which is difficult for electronics to price-match per digit. it has taken ages for SSDs to outdo dumb-magnetic-material hard drives.

Especially when compared to CD/DVD/Blu-ray discs.

not the CDRW, DVDRW, BDRE, TL RE,

Current best USB thumb drives,
article: https://www.tomshardware.com/best-picks/best-flash-drives
E.g Buffalo SSD-PUT 2TB, amazon UK: https://www.amazon.co.uk/dp/B09XS1972J/
that one is £159.99 for 2T, which is 8 pence per Gig; it's not as cheap as BDR, which is 3.2 pence per Gig!

I agree it may have other advantages, but if you have plenty of storage area, and a ton of data to store, then BDRs are very cheap, and very durable.


But to answer your question regarding burning software:
I've always used Nero Burning ROM to burn my CD/DVD discs. Now, i haven't burned anything in years (despite having many blank discs and an ODD as well) since they are obsolete.

they aren't obsolete, they are the only way to own, say, HD and 3D TV recordings. and it's in fact difficult to find drives which will record this, as they don't want people recording such things.

BDRs are also the cheapest way to store a lot of data that you are unlikely to revisit. but they do eat up storage space.


I have a 128 GB USB thumb drive for that. The only minor use would be to burn an audio CD that i can listen to in my car. But i haven't bothered. I already have several good audio album CDs (legit ones) that i listen to.
Nero Burning ROM, wiki: https://en.wikipedia.org/wiki/Nero_Burning_ROM


Why burn the *.iso to DVD? Why not use a USB thumb drive instead?

BECAUSE, 80 pence for the bluray; the thumbdrive I bought today was £14,


In-depth guide on how to install Win11 here: https://forums.tomshardware.com/threads/windows-11-clean-install-tutorial.3831442/


Monitor/TV resolutions can't be linear by the power of two.
you misunderstood what I meant!

what I meant is 4K means 4x the 2D pixel count,

but by linear I mean the linear factor of pixels:

1920 x 1080, multiply by 2 in both directions:

1920 x 2 = 3840
1080 x 2 = 2160

thus 3840 x 2160 is 2x the linear dimension,
and thus 4x the pixel count.

similarly 720p = 1280 x 720,

1920 = 1280 x 1.5
1080 = 720 x 1.5

thus full HD, 1920 x 1080 is 1.5x the linear dimension, and 2.25x the pixel count.

put another way 3840 x 2160 = (2 x 1920) x (2 x 1080).

the problematic thing is when the linear factor is non-integer, as this leads to distorted font rendering if you rescale.
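
as a quick check of the linear-factor arithmetic above, here is a small Python sketch (my own illustration; the resolutions are the standard 16:9 ones mentioned in this thread):

```python
# Linear scale factor vs pixel-count factor between common 16:9 resolutions.

RESOLUTIONS = {
    "720p":   (1280, 720),
    "1080p":  (1920, 1080),
    "4K UHD": (3840, 2160),
}

def scale_factors(lo, hi):
    """Return (linear factor, pixel-count factor) going from lo to hi."""
    (w1, h1), (w2, h2) = lo, hi
    linear = w2 / w1                    # same as h2 / h1 when the aspect ratio matches
    pixels = (w2 * h2) / (w1 * h1)      # pixel count grows with the square
    return linear, pixels

print(scale_factors(RESOLUTIONS["720p"], RESOLUTIONS["1080p"]))    # (1.5, 2.25)
print(scale_factors(RESOLUTIONS["1080p"], RESOLUTIONS["4K UHD"]))  # (2.0, 4.0)
```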


Instead, they follow the aspect ratio of 16:9 (1.78:1).
Further reading: https://en.wikipedia.org/wiki/16:9_aspect_ratio

Before the widescreen (16:9), the square-ish aspect ratio was common, namely 4:3 (e.g CRT monitors).

372px-Aspect_Ratio_Chart.svg.png

it's an impossible question as to what aspect ratio to go for!

but for computers, it's best if the pixels are square.

ultimately full HD is the first commercial one which is all-round satisfactory, with enough pixels for fonts to be nice,

720p is a bit inadequate.

and because vision is biased to left-right, rather than up-down,

as the 2 eyes are placed left to right,

a wider screen is more ergonomic for films, as it spreads the visual load between the 2 eyes, and mimics the view we have of the world as humans, where the world visually is a landscape zone.

now some animals have their eyes on the side of their head, where their view of the world is very different.

cats and owls have their eyes set right at the front; cats generally have poor peripheral vision, where they don't notice you if you approach from an angle.




There are differences between 1080p and 1080i, namely:
....
The "i" is an abbreviation for "interlaced"; this indicates that only the even lines of each frame, then only the odd lines, are drawn alternately,

that makes a lot of sense. with the Amiga 500, you could have a higher-res screen via interlacing, but it did lead to flicker for fine fonts.

maybe rather than interlacing, they should have just doubled the resolution and gone for 25Hz?


A road doesn't need to have low curvature for one to drive fast. Instead, the curve can be quite tight; what keeps the vehicle from sliding outwards is raising the road's bank angle, up to 90 degrees, and that has been built in several places. Namely, vehicle test tracks have 90-degree high-banked curves.

E.g one in Germany, Mercedes-Benz test track:

einfahrbahn-3-w1920xh1080-cutout.jpg
ok, that's a crazy idea!

but for normal roads, the curvature reduces as the speed increases!

if the road is banked then yes, that would mitigate the problem, but it does bring in a new problem: you'd have to keep up that high speed, otherwise the vehicle might drop off the curve!

with that bus, there is also a risk that it rolls downwards!



Source + further reading about that test track: https://group.mercedes-benz.com/com...bahn-untertuerkheimspecial-topic-article.html

Even some high-speed trains tilt themselves when they go around the curve, so that they can maintain the higher speed.
Further reading: https://en.wikipedia.org/wiki/Tilting_train

this is a new idea for me!

I had seen photos of those trains tilting, but I had never thought anything further about it,


but as mentioned, it's not for the general public, as there is major potential danger; it needs a skilled operator.
https://en.wikipedia.org/wiki/Tilting_train
Not only that, but a 90-degree-rotated monitor is far better for viewing webpages (less scrolling) and also when using certain office programs. A 90-degree monitor is especially helpful for programmers/game developers.

Idea is this:

bqp-programmer-duo-monitor-gw2790qt-vertical
that could be useful, but you could also put the program in 2 vertical columns on a wide screen or even 3 columns with say a 31.5" screen.

when I program I just put the constrained space to good use, to rethink the code to work in the available space.

if it starts to take up too much space, I might split the activity into 2 subroutines, where each would fit.

it's always nice when a subroutine fits in the visible space without needing to scroll.

but it's also nice when most subroutines aren't too big.

I generally only allow large subroutines when there is no other option; some things have to be done with a long subroutine.

thus if there is a long subroutine which doesn't fit, in my own programs, I know there is something tricky with that part of the program.

you realise that most of the innovative ideas of programming emerged from the earlier eras? it's because the systems were so constrained, they had to be really innovative to mitigate. whereas today computers are so unconstrained that people use brute force and inefficient programming.

but in the old days, if you programmed inefficiently, the program just wouldn't work.

CPUs were slow, memory was small, disk size was small, memory and disks were slow,

in 1977, people were still using punched cards! Clarke's shoeshop in town used punched cards for their products!

in 1978 I was given a tour of the university computer by a colleague of my dad, and it used punched tape!

by 1980, punched cards were obsolete.


eg things like data compression are to mitigate lack of storage space. eg with the Amiga, the system disk was 880K, and you wanted to boot from just one disk, so you'd compress stuff to fit more on that disk.


Well, since i have a curved monitor, the monitor plane isn't straight when viewed from the side. And the feet do protrude a bit further out from the monitor plane (~5 cm or so). But it won't get in the way of my usage.
the stand is the mechanical level of the engineering!

some things are just better designed,
there is even a mechanical aspect with the electronics, eg where to put the different circuitry on the mobo!

with a socket, which wire should be on the left, then which wire, etc!

Almost. EOL = End Of Life. Aka discontinued.

could have many different meanings,

ultimately in this case probably unavailable. the thing is you probably continue to use it! so it still is there, just maybe not for new users.

I have a lot of really useful obsolete stuff!
eg I bought a JVC VHS recorder right at the end of the VHS era, and some videos only exist on VHS, where the only way to watch them is with a VHS player.

I think the film "Meet the Applegates" only legally exists on VHS.

I bought a sextant also, which enables you to measure the angle between distant objects:

http://www.directemails.info/tom/sextant.jpg
 

Richard1234

Distinguished
Aug 18, 2016
273
5
18,685
I have run into a strange problem:

I went to install the new M.2 drive, and then the cover plate wouldn't close!

scrutinising it, this one is the 980 Pro, and it appears to have its own heatsink, which obstructs the cover,

and the other one is a 990 Pro.

checking my Amazon account, this time I did order a 980 and last time a 990.

then checking my initial post, the link says 990 Pro,
but if you click it, it says 980 Pro!

I was wondering if I could remove the heatsink, but it seems to
be attached with proprietary nonstandard screws.

the 990 I originally bought was this:

https://www.amazon.co.uk/dp/B0B9C4DKKG

I have removed the 980 and closed the cover plate with just the 990 at M2_3,

Amazon say I have a refund facility, where I selected the preset option "Incompatible or not useful for intended purpose", saying the mobo cover plate wouldn't close and that I thought I was buying a 990 Pro. their refund process is I don't have to put it in a parcel, I just take it to an Amazon counter at a post office and show a QR code, and they do the rest.

this is the problem with things that don't have a proper naming scheme, where the naming is unmemorable: danger of error!
 

Aeacus

Titan
Ambassador
are you saying some on the market aren't durable?
and some aren't reliable?
Yes.

Cheaper components (corner cutting) lead to a cheaper price for the product, but the product is then also less durable and less reliable.

when you say reliability, are you talking about where sometimes the drive gives an error, and sometimes not?

where durability presumably is about where it goes kaput, with permanent error from some point in time onwards?
Durability can be considered a measure of the most likely maximum normal use of a product before it reaches its end of life, while reliability is directly related to the probability of product failure given normal environmental and operating conditions.

does it convert PCI_E2 into two PCIe 5.0 x 4 M.2 sockets?
Yes.

Once you remove the heatsink with fan on the M.2 Xpander-Z Gen5 Dual card, you'll see two PCI-E 5.0 x4 slots for 2x M.2 SSDs.

mb-20220927-6.jpg


as that would enable up to three PCIe 5.0 x 4 M.2 drives for the mobo.
Yes.

are those sockets then called M2_5 and M2_6?
Not officially. You can call them that, but the MoBo manual doesn't name the two slots on the expansion card individually. Instead, the MoBo manual (pages 40-43) refers to the card as a whole.

what are the negatives for the £0.05 ones?
You did read the review, right?
If you only need 1TB of capacity, this is a relatively inexpensive PCIe 4.0 drive that would work great in a Playstation 5 or desktop PC. Thanks to the DRAM-less controller and 176-layer TLC flash, it’s also power efficient enough to work great with laptops.

The UD90 comes in three capacities of 250GB, 500GB, and 1TB. Silicon Power has informed us that it intends to launch with just the 1TB SKU in the U.S. The smaller SKUs are for other regions and markets for now. This drive reaches peak performance at 1TB, and that capacity tends to be the “sweet spot” for mid-range drives like this, so this is not too terrible. Some users prefer smaller drives for the OS in multi-drive builds, but it’s hard to get the most out of a fast PCIe 4.0 drive without a higher capacity for more flash and interleaving.

We use the DiskBench storage benchmarking tool to test file transfer performance with a custom, 50GB dataset. We copy 31,227 files of various types, such as pictures, PDFs, and videos to a new folder and then follow-up with a reading test of a newly-written 6.5GB zip file.

Silicon Power UD90 (Image credit: Tom's Hardware)

The UD90’s first misstep is here, but it’s not a big one and not unexpected. DiskBench results are limited by a drive’s bandwidth potential, and as such, the lower-end drives like the SN770, FX900, P400, and UD90, fall behind on reads and copies. The UD90 still remains pretty close to its direct rivals. If you absolutely need the fastest file transfers, you should be looking at a high-end PCIe 4.0 drive.

Sequential results in CrystalDiskMark are limited by the interface or controller bandwidth, as determined by its channel count and bus speed, leaving the UD90 in the lower tiers. However, it’s still faster than any PCIe 3.0 drive.

The UD90 writes in its fastest, pseudo-SLC state at over 4.6 GBps for almost 15 seconds. This implies a dynamic cache of around 69GB which, although presumably shrinking with drive usage, is sufficient to absorb random and smaller sequential writes. The UD90 then drops down to around 1.8 GBps for another eight minutes of writes. Then, finally, it hits its slowest state at about 275 MBps.

hcHujK2zSjpM2czUBuNfRQ-970-80.png


We can reasonably compare this to its peers and see that it has a significantly smaller SLC cache, a faster middle state, and a very slow worst-case scenario when SLC must be emptied.

Most users will not see the worst a drive has to offer, but dynamic cache inevitably shrinks with drive usage, and sustained writes will eventually slow down the drive. The UD90 offers a nice, balanced approach, with higher speeds in the middle state than we see on E12 and E16 SSDs, although in practice what we see with the FX900 and P400 might offer a slightly better user experience.

The UD90 did not recover its SLC cache quickly, instead bouncing back to its middle state, which is still plenty quick. This drive can better handle bursty writes, especially random ones, as befits normal consumer usage. You'll need to jump to a high-end model if you want faster performance in sustained write workloads.

The Silicon Power UD90 is yet another winner in a stream of affordable, DRAM-less SSDs that manage to exceed expectations. Improvements to controller design and flash have allowed manufacturers to offer efficient, powerful drives at a reasonable price point. Of course, these aren’t the fastest drives — they don’t saturate your PCIe 4.0 connection, and in everyday use, they might not be a huge upgrade over older PCIe 3.0 drives that had DRAM.
One thing the review can't test is reliability. And durability testing, too, needs more time.
So, it is a cheap drive that can't perform at the full level of what PCI-E 4.0 can offer. And when a write is sustained for longer than ~8 minutes, its write speed plummets to SATA2 level.
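
The review's ~69 GB cache figure is just write speed multiplied by time; a minimal Python sketch of that estimate (the numbers are the ones quoted from the review above):

```python
# Rough pSLC cache estimate from a sustained-write trace, as in the review:
# the drive wrote at ~4.6 GB/s for ~15 s before dropping to its middle state.

def cache_size_gb(burst_speed_gb_per_s: float, burst_seconds: float) -> float:
    """Approximate dynamic SLC cache size = burst write speed x burst duration."""
    return burst_speed_gb_per_s * burst_seconds

print(cache_size_gb(4.6, 15))   # ~69 GB of pseudo-SLC cache
```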

um, you might not have noticed, but you need a hulking PC and monitor to write to thumbdrives also! you need a PC with a mobo and a USB controller, and a chipset, a monitor, etc., whereas with optical drives, you can just have a hifi system, where the technology is better encapsulated into the drive.
How many people, at the current date, have a hi-fi system in their home? Compared to having either a desktop or laptop PC?

Also, a hi-fi system has only one purpose - audio reproduction. While the purpose of a PC is unlimited. Everything that a hi-fi system can do - so can a PC. But can a hi-fi system do everything that a PC can? No.

The thing with discs is that most modern PCs (desktops and laptops) don't have an ODD in them anymore. And while you can add an external ODD to a PC to use the discs, it still is a separate device just to use discs. While with USB thumb drives, USB ports are already built in to all PCs. USB ports are very versatile and can accept all kinds of different devices, including an external ODD.

even with CDs, you had CD-RWs, where you can read and write and delete, and re-record, etc.
BD-RE discs can sustain up to 10000 rewrites, while USB thumb drives (depending on the drive) can sustain between 10000 and 100000 rewrites. So, a USB thumb drive, compared to a BD-RE disc, either has an equal number of rewrites or up to 10x more.

BDXL discs that are 128GB are all BD-R.
Example: https://www.amazon.com/Digital-Printable-Blu-ray-Recordable-archival/dp/B0CDKJDLFF/
$9,4 USD per 1 disc.

Highest capacity BD-RE disc i found is 100GB.
Example: https://www.amazon.com/Blu-ray-Sony-Video-3BNE3VEPS2-Double/dp/B087MT3NWL
$15,3 USD per 1 disc.

To match the 100000 rewrites of a USB thumb drive, you need to have 10x BD-RE discs, which are 100 GB per disc (or 1000 GB in total). Price-wise, that's $153 USD for 10x discs at a total capacity of 1000 GB.

Best USB thumb drive, Buffalo SSD-PUT at 2TB (2048 GB) costs $139,
amazon: https://www.amazon.com/BUFFALO-External-SSD-USB-‎‎SSD-PUT1-0U3B/dp/B09XS1972J/

So, with less money ($153 vs $139 USD), you'll get twice the capacity compared to BD-RE discs. Not to mention considerably smaller dimensions, doesn't need additional hardware to use the USB thumb drive (compared to need of ODD for BD-RE) and much faster read/write speeds (616.9 MB/s sequential read speed and 543.9 MB/s sequential writes).

With this, i don't see how BD-RE has better value, when also factoring in the rewrite amount + need for ODD + taking up much more space + slower read/write speeds + noise of ODD as well.

the million dollar question is what price per GB for that 2TB drive?
Buffalo SSD-PUT 2TB costs $139,99. It being a 2048 GB drive, that brings the price per GB to $0,068 USD.

Which makes (after currency conversion) £0.055 or 5,5 pence per GB. Considerably cheaper than your BD-RE disc price per GB.
I bought a box of 10 Sony TL REs for £82.52, direct from Japan, not arrived yet, which is £8.25 per 100 Gigabytes of re-recordable data, which is 8.25 pence per GB. just to experiment with the technology, to see what it's about.

I went to install the new M.2 drive, and then the cover plate wouldn't close!
When an M.2 drive comes with its own heatsink already installed, you don't need to use the MoBo M.2 cover, since in general it does the same thing as a dedicated heatsink - albeit worse. So, whenever possible, use dedicated heatsinks (when needed) with M.2 drives, rather than relying on the M.2 cover that comes with the MoBo.
 

Richard1234

Distinguished
Aug 18, 2016
273
5
18,685
Just saw the news that SSD prices aren't going to drop. Instead, they are in the rising trend.
Article: https://www.pcgamer.com/hardware/st...h-demand-and-supply-chain-challenges-wd-says/
nonetheless, my current plan is to wait and see,

BECAUSE,

although SSD prices may be rising, and there is inflation, especially from scarcity of raw materials and pent up demand,

that is the trend for the main market,

but the top end of any technology is a different dynamic, where they hike the prices to recoup their R&D costs,

with time, an improved or an alternative technology moves to the top spot, and prices of the earlier technology will drop, this downtrend is faster than the ambient uptrend,

eventually the prices could rise.

so the price graph of the top end will move down, stabilise, and then move upwards with the market.

the article said the entire market is moving up, now the AI demand I think will go for the max capacity ones, this is where if I want a 4T PCIe 5.0 x 4, it may be better to wait till 8T is the top end.

the thing is AI is overhyped, and a lot of firms will go bankrupt by overrelying on AI, where AI will push through crazy decisions. the stampede for AI is greed, where everyone is assuming everyone else is going for AI, which is a classic bubble. at some point the bubble will burst. Google has a problem by using AI, in that a lot of their money is from the Google adverts at websites. but if Google gives an AI answer to a question, people don't visit the websites, and Google makes no money! Google will no longer be a search engine but an answer engine. their only way forwards will be a paywall, but 99.999% of people out there don't have any money, and will just stop using Google. Elon Musk is learning this with Twitter, where it was only popular because it was free; put up a paywall and people will go elsewhere. Google will go the way of Twitter if they bring in a paywall. I think Google are already sacking people.

in the old days, you'd search on Google, get a lot of URLs, and study the ones at the top, thereby finding interesting sites, and learning unexpected things. but today with their AI answers, I visit far fewer websites, yet Google's answers are all derived from those websites, so this will lead to trouble for Google. The Google AI answers are highly unreliable; you can't just blindly trust anything on the internet, but the Google AI does just that, it blindly believes everything on the internet and goes for the consensus view.


with the PCIe 5.0 x 4 drives, the top end looks like 4T,

the top end is likely to become 8T,

where that Crucial T705 is impressive technology, another firm will come up with a different impressive technology, to some extent the manufacturers are gaming the system.

they could release some technology way beyond today's, but they then miss out on a lot of profits, so what they do is edge the technology forwards by a small step, and they make a new stream of profits from that, until the market dries up, then they edge forwards another small step! they hold back their technology. eg in 2011 at the IFA show, I saw a Philips "3D without glasses" technology, which was like a hologram in a scifi film, using some kind of hologram technology. I asked how much it cost, the Philips exec said: this is not available to the public. they had showcased it basically because I think Toshiba were showcasing their 3D without glasses technology at the same show, which was much more basic, a bit like those 3D postcards. They made the technology public out of panic that Toshiba might outdo them.

the thing is, now 13 years later, that Philips technology still hasn't been released!

the Toshiba technology was used for a few mobile phones, but that all got eclipsed by the first iPhone! which was a different technological development.

years ago, I heard that Apple have future versions of the iPhone already designed, but they delay releasing them.

there is always a premium for the top end, and in general it's best to avoid the topmost end of technology, as you get something better and cheaper by waiting. topmost-end technology usually becomes junk.

now maybe this time will be different, but I work with how things usually are, it's always a gamble, and also I don't want to wait too long, but I am prepared to wait some months as a "wait and see" MO.

that article says various firms might go bankrupt, but then the SSDs become bankrupt stock, and the prices will drop. in a recession you can get deflation, where bankrupt stock is much cheaper, solvent firms then have to drop their prices, and then they go bankrupt, etc!

approx 1978, one of my dad's colleagues got one of the early LCD watches, and it was an expensive luxury, I think a Seiko, but it could be a Citizen, advertised in Newsweek magazine like the Rolexes; a year later my dad bought me a Casio with 5 alarms, countdown, stopwatch, at a fraction of the price. that earlier watch I think just gave the time, no other functions!

when I bought my SLR camera in 2012, it was some £550, 2 years later I saw it in town for some £250.

with EVs, originally it was mainly Tesla, and theirs were something like £45000, which was a total false economy, as I could drive on diesel till my car was fully obsolete, and still be nowhere near 45000 total cost.

eventually I saw a non Tesla EV ad on TV, a Renault Zoe, then after some weeks another EV, possibly a VW, and now there are so many EVs, eg the former Tesla showroom at a Bristol mall became a Polestar EV showroom. and at some point Tesla dropped their prices to £20000!

now I plan to buy a second hand car, and even when the Zoe and VW etc appeared, there was no second hand market yet. you need a year before the 2nd hand market begins, with the 1 year old cars, but you need 3 years before you have 1, 2, and 3 year old cars.

also by waiting, the technology improves, prices drop,

my plan is to get a 2nd hand hybrid, that way I can continue using fuel, the EV aspect is just so I can drive anywhere, right now there are clean air zones in the centres of many cities, eg London's ULEZ, Bristol, Birmingham, Oxford, etc

and you have to pay maybe £12 a day for any day you drive in that zone, enforced by ANPR (automatic number plate recognition) cameras,

anyway, each month I wait, the 2nd hand market improves, my initial plan is to get a 2 year old 2nd hand hybrid car, but if prices too high, I might go for 3 year.

the moment you buy a new car in the UK, the price drops 20% because of the VAT tax on new cars. with a new Tesla, the price drops by £3333.33 the day after you buy it!
 

Aeacus

Titan
Ambassador
Your call whether to use the newest, current or yesteryear tech.

When i need something, i look at what is currently available and get that, rather than wait for better deals, since one can remain waiting forever.

the thing is AI is overhyped, and a lot of firms will go bankrupt by overrelying on AI, where AI will push through crazy decisions. the stampede for AI is greed, where everyone is assuming everyone else is going for AI, which is a classic bubble. at some point the bubble will burst. Google has a problem by using AI, in that a lot of their money is from the Google adverts at websites. but if Google gives an AI answer to a question, people don't visit the websites, and Google makes no money! Google will no longer be a search engine but an answer engine. their only way forwards will be a paywall, but 99.999% of people out there don't have any money, and will just stop using Google. Elon Musk is learning this with Twitter, where it was only popular because it was free; put up a paywall and people will go elsewhere. Google will go the way of Twitter if they bring in a paywall. I think Google are already sacking people.
This reminds me of the fiasco Amazon had with their "AI powered" instant check-out system.
Article: https://arstechnica.com/gadgets/202...e-checkout-which-needed-1000-video-reviewers/

Whereby:
"AI" checkout was actually powered by 1,000 human video reviewers in India.
:rofl:

As far as AI goes, it's another step towards the Technological Singularity,
wiki: https://en.wikipedia.org/wiki/Technological_singularity

1st we had PCs, which made many calculations much faster than humans; thus, the dedicated people needed to calculate stuff lost their jobs.

Comptometer-Calculating-division-1914-Brit-Heritage.jpg


2nd step was robotics (automation), where line workers in factories were made essentially obsolete.

assembly-solutions-carousel-1.tmb-indcapcrsl.jpg


Now, the 3rd step is AI, which does the thinking for humans.
E.g AI art that won an art prize:

merlin_212276709_3104aef5-3dc4-4288-bb44-9e5624db0b37-superJumbo.jpg

Article: https://www.nytimes.com/2022/09/02/technology/ai-artificial-intelligence-artists.html

And of course, AI usage has become cheaper, faster and better quality than that of humans. E.g article where a game developer paid an AI artist for the artwork, rather than hiring artists to do the drawings,
article: https://www.pcgamer.com/games/card-games/champions-tcg-ai-artist/
 

Richard1234

Distinguished
Aug 18, 2016
273
5
18,685
original question:
are you saying some on the market aren't durable?
and some aren't reliable?
Yes.

Cheaper components (corner cutting) lead to a cheaper price for the product, but the product is then also less durable and less reliable.

I would like to see some specific examples,

I know with USB magnetic drives, I had one by I think Verbatim, which soon stopped functioning. when I went to the shop, they said I have to contact the manufacturer; I didn't, because I would need to post it in, and they could trawl through too much of my data!


Durability can be considered a measure of the most likely maximum normal use of a product before it reaches its end of life, while reliability is directly related to the probability of product failure given normal environmental and operating conditions.

original question: does it convert PCI_E2 into two PCIe 5.0 x 4 M.2 sockets?

that is cool! so it means I can have three PCIe 5.0 x4 M.2 drives with this machine, and all via the CPU, where presumably there is no bandwidth contention.


Once you remove the heatsink with fan on the M.2 Xpander-Z Gen5 Dual card, you'll see two PCI-E 5.0 x4 slots for 2x M.2 SSDs.

mb-20220927-6.jpg
I haven't opened it, as I only open stuff at installation time,

with the graphics card, the box is still factory sealed; I will only open it in the same session I try to install it.


Yes.


Not officially. You can call them that, but the MoBo manual doesn't name the two slots on the expansion card individually. Instead, the MoBo manual (pages 40-43) refers to the card as a whole.


You did read the review, right?
I read the other review, but didn't read this review as it's for PCIe 4.0, where I had accepted the price of the higher-end Samsung PCIe 4.0!

basically I ordinarily won't read stuff about things of a lower grade than what I intend to buy.

ie I mainly don't read stuff just to know about things, but I read info either to act upon, or because it is interesting.

now I might read some stuff just to know, but this might be, say, with science; with technology and modern things it's not possible to know everything, so you have to be selective.

but in the 1990s and earlier, there was much less info around, and one could read around for the sake of it,
mainly because there was nothing else to do. in those days, I'd buy an Amiga magazine, where these were monthly, eg Amiga Format and Amiga User International, and because of a lack of things to do, I'd eventually have read every article!

there was nothing on TV, for books, there would just be a few of interest in a bookshop, and usually too expensive to buy.

One thing the review can't test is reliability. And durability testing, too, needs more time.
So, it is a cheap drive that can't perform at the full level of what PCI-E 4.0 can offer. And when a write is sustained for longer than ~8 minutes, its write speed plummets to SATA2 level.

How many people, at the current date, have a hi-fi system in their home? Compared to having either a desktop or laptop PC?

Also, a hi-fi system has only one purpose - audio reproduction. While the purpose of a PC is unlimited. Everything that a hi-fi system can do - so can a PC. But can a hi-fi system do everything that a PC can? No.
but we have TV boxes, and usually you can only record to the internal drive.

but with mine you can transfer from the internal drive to blurays, both BD-RE and BD-R.

no computer needed.

you can also transfer to a USB hard drive, but it uses a proprietary format,

which I think may mean you can't transfer to a thumb drive, as those seem to have their filesystem on the disk itself; this is why that fake 64T drive was so elusive, as it did the filesystem internally, faking 64T.

thumb drives tend to be FAT only; maybe today the technology has moved on, but the ones in the past were FAT only.

the mobo has a thumbdrive, USB3, and it's not like a normal drive; I wanted to make a backup, and cannot see how to do that. I will try with Linux Mint later.



The thing with discs is that most modern PCs (desktops and laptops) don't have an ODD in them anymore. And while you can add an external ODD to a PC to use the discs, it still is a separate device just to use discs. While with USB thumb drives, USB ports are already built in to all PCs. USB ports are very versatile and can accept all kinds of different devices, including an external ODD.


BD-RE discs can sustain up to 10000 rewrites, while USB thumb drives (depending on the drive) can sustain between 10000 and 100000 rewrites. So, a USB thumb drive, compared to a BD-RE disc, either has an equal number of rewrites or up to 10x more.
this is academic, there is absolutely no way you'll get anywhere close to 10000 with either!

in general we just add to drives. we might ongoingly edit say a letter, but how many times is that? probably less than 100x.

blurays generally are for archiving, where eg a disk is full, so you have to either delete or shunt data somewhere. blurays are ideal for this, and in fact I use BD-Rs, which are write-once, and it is just fine!

eg I make iso file copies of installation disks, and that kind of directory is ideal for archiving to free up space.



BDXL discs that are 128GB are all BD-R. Example: https://www.amazon.com/Digital-Printable-Blu-ray-Recordable-archival/dp/B0CDKJDLFF/
$9,4 USD per 1 disc.
yes, the current technology seems to be write-once only for 128G; eg this new drive is just write-once for those, and is rewritable for the 100GB ones. I scrutinised that before purchasing, to see which things were recordable, and which were R and which were RW=RE.

the fact it is write-once only isn't as big a problem as you seem to want it to be,

I agree it's a problem for, say, temporary files,

but other than the OS partition, most directories are either archival, eg camera photos, downloads of purchases, and documents, or are work in progress, eg emails, letters; enhancing of photos usually starts and finishes in the same session, where the speed isn't so important.

optics is ultimately superior to electronics: light is massless and fast, 3 x 10^8 m/s, whereas electrons have mass, the signals they carry propagate more slowly (around 2.7 x 10^8 m/s at best in a cable, with the electrons themselves drifting far slower), and they generate a lot of collateral problems, eg heat and EMR. light is super lightweight; it has minimal collateral damage.

as you pointed out yourself, optical HDMI cables can go for very long runs without problem, whereas electric cables run out of steam after 15 metres, and SATA a feeble 1m!

main thing is less money has been thrown at optics, because the main firms are wedded to electronics,
so are stuck in their ever deeper ditch, like happens with most things.

I think it is primarily Sony who are pushing optics, with Philips to some extent. and the Japanese firms generally make the best optical drives.


when you look at the raw physics, optics is better.

even electronic integrated circuits have to be made using optical technology, where they project a pattern photographically onto the silicon to generate the circuits.

remember that optical disks are just one line of innovation, 1985 is when I first was aware of CDs,

I find it impressive that costwise they still outdo electronics,

also impressive is how magnetic drives outdid electronics for so long.

in fact the old debate was when and whether optics would outdo magnetic.

what happened with magnetic could be replicated with optics, by using harder encased optical disks,
optics would then leap even further ahead.

I always thought it was a bad idea for optical disks to be unencased; the 3.5" floppy disk was a much better idea.

optics needs new lines of innovation; they are doing some, eg with quad layer giving 128GB.

photons are much smaller stuff than electrons; a photon of light will, say, temporarily shunt an outer electron to an even more outer electron shell.

I don't know if they have found a way to do logic circuits with light, but light does relate to semiconductors, eg LEDs, and photosensors, eg with solar, are light-based electronics.

I had fibre-optic internet long ago, which was originally the Blueyonder firm from the 1990s, and it was vastly faster than the electrically based other firms.


Highest capacity BD-RE disc i found is 100GB.
Example: https://www.amazon.com/Blu-ray-Sony-Video-3BNE3VEPS2-Double/dp/B087MT3NWL
$15,3 USD per 1 disc.
I got Sony 100GBs at some £8.25 per disc on eBay, 10 of them, from Japan; it's a Japanese firm, you need to buy them from Japan!


To match the 100000 rewrites of a USB thumb drive, you need to have 10x BD-RE discs, which are 100 GB per disc (or 1000 GB in total). Price-wise, that's $153 USD for 10x discs at a total capacity of 1000 GB.
it's a meaningless comparison, as both 100000 and 10000 are way beyond what one will do in real life.

files on a disk are eg:

1. program installations, which don't change much. install and forget.
2. work projects, eg photos, documents and other text such as programming and scripts,
explain to me the scenario where you'd even rewrite 1000x?

when a disk fills up, or you buy a more advanced disk, you might shunt data over from an old disk,
but that is just 1x,

maybe you rearrange the directory or move files, but usually the file data doesn't move, just the metadata is changed.

Now a temporary directory might have ongoing rewriting, but this is where the design of the filesystem is important. eg with Windows, I think the filesystem will keep progressing forwards through the disk, enabling undeletes.

when you delete, the data is still there; the next write goes even further along, till it reaches the end.

filesystems aren't one thing, they can be designed one way or the other, and eg write-once media constrain the design: they allow deleting, but you have to keep moving forwards (a minimal sketch of that idea follows below).
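
to illustrate that "keep moving forwards" idea, here is a minimal Python sketch of an append-only store of my own invention (not Windows' or any optical disc's actual on-disc format): writes always go to the end, a delete just appends a tombstone record, and old data stays physically present until the medium runs out.

```python
# Minimal sketch of a write-once / append-only store:
# data is only ever appended; "delete" appends a tombstone instead of erasing.

class AppendOnlyStore:
    def __init__(self):
        self.log = []                  # the medium: records written in order

    def write(self, name, data):
        self.log.append(("put", name, data))

    def delete(self, name):
        self.log.append(("del", name, None))   # tombstone, nothing is overwritten

    def read(self, name):
        """Latest record for a name wins; a tombstone means 'not found'."""
        for op, n, data in reversed(self.log):
            if n == name:
                return data if op == "put" else None
        return None

store = AppendOnlyStore()
store.write("letter.txt", "draft 1")
store.write("letter.txt", "draft 2")   # old draft still occupies space in the log
store.delete("letter.txt")
print(store.read("letter.txt"))        # None - deleted, but nothing was erased
print(len(store.log))                  # 3 records consumed on the medium
```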

ordinarily one wouldn't use an optical drive for temporary files, and one wouldn't use a thumbdrive, but would use either magnetic or SSD. because the thumbdrives are often slow; maybe today they are faster, but in the old days at least, they were a bit slow. also often they can only use the FAT filesystem, and eg FAT32 cannot cope with really big files (anything over 4 GiB).

eg I couldn't transfer the Windows 11 iso to the Sandisk flashdrive, as it's too big for the filesystem, despite plenty of free space on the disk!

have you tried copying the Windows 11 iso file to the Buffalo thumbdrive?

what filesystem does Windows say it has?
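
on the FAT32 point: its file-size ceiling is 4 GiB minus one byte, which is exactly why a Windows 11 iso (well over 4 GiB) refuses to copy. a small Python sketch of that check (the path is just a hypothetical example):

```python
# Check whether a file would fit on a FAT32-formatted drive.
# FAT32's maximum file size is 4 GiB minus 1 byte.
import os

FAT32_MAX_FILE_SIZE = 4 * 1024**3 - 1   # 4,294,967,295 bytes

def fits_on_fat32(path: str) -> bool:
    return os.path.getsize(path) <= FAT32_MAX_FILE_SIZE

# Hypothetical example path - substitute the real location of the iso.
iso = "Win11_English_x64.iso"
if os.path.exists(iso):
    print(iso, "fits on FAT32:", fits_on_fat32(iso))
```

the usual workaround is to reformat the stick as exFAT or NTFS, or use a tool that splits the large image file.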

Best USB thumb drive, Buffalo SSD-PUT at 2TB (2048 GB) costs $139,
amazon: https://www.amazon.com/BUFFALO-External-SSD-USB-‎‎SSD-PUT1-0U3B/dp/B09XS1972J/

So, with less money ($153 vs $139 USD), you'll get twice the capacity compared to BD-RE discs.
BD-RE aren't cost-effective; I use BD-R for archiving, and I use faster BD-Rs by TDK,
which are 1-6x speed,

my BD-Rs are 3.2 pence per gigabyte, which at the current exchange rate is about 4 cents,

your Buffalo is 6.79 cents per gigabyte, so is 70% more expensive!

when you have a lot of data to archive, you look at the price, and you just rethink your MO for the fact that it is write-once only, where you just organise the stuff more thoughtfully and better. you can always shunt the stuff to a new BDR if you had to, but I have only shunted between BDRs to give someone a copy of a recording, not for my own data. and that only maybe 2 or 3 times ever.

because it forces me to think, I use BDRs much more intelligently than flashdrives.


I end up with 100s of BDRs, but each is a specific topic, eg some are music programs, some are films, some are music documentaries, etc. but the thing is my entire collection of BDRs fits in a box which isn't ginormous but is a bit heavy.

the constraint in fact makes me organise them better.



Not to mention considerably smaller dimensions, doesn't need additional hardware to use the USB thumb drive (compared to need of ODD for BD-RE) and much faster read/write speeds (616.9 MB/s sequential read speed and 543.9 MB/s sequential writes).

With this, i don't see how BD-RE has better value, when also factoring in the rewrite amount + need for ODD + taking up much more space + slower read/write speeds + noise of ODD as well.
I don't use BD-REs!

I use BDRs, and the write-once isn't really a problem; it's for long-term archiving of stuff that I don't wish to delete, eg driver CD isos, photos.

eg I have a BDR of all the 35mm photos I have scanned, where I resized them and adjusted the jpeg quality so they'd all fit.

I can then just take 1 disk, and show people the photos on their TV via a bluray reader machine.

the size of the ODD is insignificant compared to the hulking PC you need to read from the ODD or flash drive!

having it external is better as I can reconnect that to different PCs, and eg install different Windows versions on different machines.

the ODD only makes a noise when you read or write the data, usually most of the time it is silent.

writing is the noisiest, but the recording sessions are always one offs.

the BDR's are 3.2 pence per gigabyte, and I can put 25 Gigabytes on one disk, 25 Gigabytes is a lot of data.

I dont care if modern drives are vastly bigger than 25Gig, that doesnt change the fact that 25Gig is a huge amount of data. eg the windows 11 iso is less than 7 gig.

I have been using a 500G SSD for my main data for several years, and its still not full. the fact you can get 8T drives, doesnt change the fact that even 1gig is a lot of storage!

all the stuff you have ever written or thought about will easily fit in 1 gig of diskspace!

as regards speed, the biggest use for BDRs is for videos, and the entire point is they deliver the speed that video needs! eg for HD and 3D HD, you can record this to BDR, you cant to DVDs as that technology cant handle the speed.

If I can't find a driver CD, I can just burn another one from the archived ISOs. Speed isn't so important; the main problem is existence: can I make the driver disc exist at all?

And e.g. for installing Windows as a one-off, speed isn't so important either; whether it took 40 minutes or 20 minutes two years ago, so what?


Buffalo SSD-PUT 2TB costs $139,99. It being 2048 GB drive, will bring price per GB to $0,068 USD.

Which makes (after currency conversion) £0.055 or 5,5 pence per GB. Considerably cheaper than your BD-RE disc price per GB.
And my BD-Rs are 3.2 pence per GB, so they are still ahead.

Your Buffalo is cheaper than some optical formats, but not than BD-Rs.

And these BD-Rs were already available 12 years ago, in 2012, when I bought the Blu-ray writer TV box, and your Buffalo cannot outdo them even today, despite enormous advances in technology!


Most writes are never rewritten, especially not archival writes, so write-once is only a restriction for general use, not for archival use.

We have too many files in this era to modify or move them much.

I haven't counted files recently, but a person could easily have a million files on their disks;

it's just not viable to manage those individually.

I probably have thousands of photos from my camera, and I may enhance a photo, but then that is that.

I read in the photo, spend 20 minutes enhancing it, and save the result to a different directory. It is not a rewrite, as I always keep the original photo as a master copy,

and I generally never edit that photo any further.

Now I might resize the photos to fit on a smaller disc, or e.g. resize them to HD to use as desktop backdrops,

but we are talking about a few transfers of the one file. It is never 10,000x, never 1,000x, never even 100x; it could be 5x over the years.

Now a system temporary-file directory could see a lot of rewrites, but the filesystem's MO mitigates that if it always writes forwards. And I wouldn't think of using optical media for a temporary directory anyway; it's more for one-off large files.


If you have 100 video recordings, the problem with using the Buffalo is that those recordings cannot be copied to another drive and cannot be used on any other machine, as they use a proprietary format which ties the recording to the drive and the recorder. You can only view the videos from that video recorder.

But with BD-Rs I can view them on any machine.

This isn't a problem with the technology per se; it's restrictions built in to prevent piracy.

E.g. some TV broadcasts can legally be copied, but only once, twice or maybe three times for personal use,

where you can't copy a copy, only the original recording; the machine counts how many copies you have made, and once it reaches 2 (or 3?), it won't allow further copies to BDs.



When M.2 drive comes with it's own heatsink already installed, you don't need to use MoBo M.2 cover, since in general, it does the same thing as dedicated heatsink - albeit it is worse in that. So, whenever possible, use dedicated heatsinks (when needed) with M.2 drives, rather than relying on the M.2 cover that comes with MoBo.
Yes, but the problem is that the 990 Pro doesn't have a heatsink while the 980 Pro does,

so the cover won't fit, and the 990 Pro then has no heatsink! At least no heatsink above; it does have a soft thermal pad below.

There is just one cover for both M.2 slots!

The only other option is to move one M.2 drive to, say, the M2_4 socket, which has a different plate.


The Ace plate has soft grey thermal pads on both sides of the M.2, so its heatsinks might be higher grade than other motherboards'?

I have already requested a refund; I will take the 980 Pro to the Amazon counter, hopefully later today, and will get a second 990 Pro.