Everyone assumes that a review encompasses every person's scenario.
No.
A review only describes what someone has seen on their own system. For example, when I tested the ML360R, it performed better than every other AIO at the time, on the same hardware, using the same tests. That does not automatically hold for anyone else: you cannot know what hardware someone else is running, what their clock speeds are, or whether they used the same tests, the same versions of those tests, and the same drivers across the hardware. I do, because my test rig is set up that way, and it has to be in order to keep every test and every piece of hardware I evaluate on equal footing.
Take a review with a grain of salt, because that's what most public reviews are: simply someone's perspective. Online reviews can be written by vendors themselves, by competitors, or by people paid on their behalf to sway review results and 'stars'. I've also noticed plenty of reviews that like to 'counter' or 'bash' my review of a product, only for it to turn out that different hardware was being evaluated. Different CPUs generate heat differently, and not every person's setup matches what was sent to me.
Was the BIOS on the pump updated before testing?
What were the ambient room temperatures during all runs, and are the reported temperatures a valid delta over those ambients?
Is the same thermal compound used for every single test?
Are the coolers given a 1-hour burn-in so the thermal paste can spread evenly before testing?
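To make the ambient-delta point concrete, here is a minimal sketch (the temperatures are hypothetical, purely for illustration) of why comparing raw CPU temps across rooms is misleading, while deltas over ambient are comparable:

```python
def delta_over_ambient(cpu_temp_c, ambient_c):
    """Return the CPU temperature as a delta over room ambient, in degrees C."""
    return cpu_temp_c - ambient_c

# Two runs of the same cooler on different days (hypothetical numbers):
run_a = delta_over_ambient(cpu_temp_c=68.0, ambient_c=23.0)
run_b = delta_over_ambient(cpu_temp_c=70.5, ambient_c=25.5)

# Raw readings differ by 2.5 degrees, but the deltas are identical:
# the cooler performed the same once ambient is accounted for.
print(run_a, run_b)  # 45.0 45.0
```

A reviewer who reports only the 70.5 °C raw reading looks 2.5 °C "worse" than one who tested in a cooler room, even though the hardware performed identically.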
Keep these questions in mind when you make objective comparisons. Not all product reviews on Amazon or Newegg (or insert-site-here.com) are fully vetted; most are just upset users who want their voice to be heard.