I am not comfortable with temps that high being a new normal.
I figure many won't be. The option to adapt is there... or a leash can be put on it, if that makes one feel better, at a minor performance loss, which no one likes to hear (even if the chip is operating outside its optimal spec out of the box). If the silicon can tolerate it, cool (no pun intended). It's not like it's flesh and bone, which can't tolerate nearly as much.
The cooling/AC for the room I use for work and play can't deal with those temperatures and remain comfortable, or money-conscious for the entire house.
Are you relating operating temperature to the heat dispersed into the room? 'Cause that's not how that works. Your comfort and the ambient temperature are influenced by the power used, not the chip's temperature. The energy used doesn't just disappear; it moves into the room. You could have two 14900Ks rendering for an hour:
A) No power limits, using up 300w, hanging around 85C on all cores, on a custom loop.
B) Long and short power limits set to 200w, hitting the 100C default with an air cooler.
Sample A is going to warm things up faster, as it's using and releasing more energy into the room on average.
One could argue that Sample A would finish the task faster, so Sample B would end up being the bigger heater instead... but would it really? The previous gen has been shown to lose very little performance when lower power limits are enforced.
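The back-of-the-envelope math here is just power times time. A quick sketch, with the runtimes as illustrative assumptions (say the 200w limit costs ~10% render time, in line with how little the previous gen lost):

```python
# Heat released into the room is just power x time; a chip's core
# temperature doesn't factor in at all.

def heat_into_room_kwh(power_watts: float, runtime_hours: float) -> float:
    """Total energy dumped into the room over the run, in kWh."""
    return power_watts * runtime_hours / 1000.0

# Sample A: no power limits, finishes the render in 1 hour.
sample_a = heat_into_room_kwh(300, 1.0)
# Sample B: limited to 200w, assumed ~10% slower (hypothetical figure).
sample_b = heat_into_room_kwh(200, 1.1)

print(f"Sample A: {sample_a:.2f} kWh")  # 0.30 kWh
print(f"Sample B: {sample_b:.2f} kWh")  # 0.22 kWh
```

Under these assumptions, Sample B only becomes the bigger heater if the limited run takes more than 1.5x as long (300/200), which the previous gen's results suggest it won't.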
Those 4090s, with their impressive thermal management, are still dumping around 300w or more into a room. Nobody's room is getting uncomfortably warm just because their fancy new CPU has a few cores running at higher temperatures than the GPU core does in a game.
Even then, a PC still pales in comparison to what a refrigerator can do to a room.