InvalidError:
Banding is already quite visible at 8 bits in the SDR range when you have a monochromatic color scale or any other steady gradient spread across a screen large enough that the repeated pixel colors become readily noticeable. It may have been deemed good enough when the standard was defined, back in the days of SD CRT TVs, where nearby pixels blended into each other to blur boundaries and the resolution wasn't high enough to clearly show banding. That isn't true on today's FHD LCDs, where pixel boundaries are precisely defined and an 8-bit gradient across 1920 pixels produces clearly distinguishable bands roughly eight pixels wide.
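The band-width arithmetic above is easy to verify. A minimal sketch that quantizes a full-width ramp to 8 bits and measures the resulting runs of identical pixels (the numbers here are just the 1920-pixel FHD case from the comment):

```python
# Each of the 256 8-bit levels maps to a run of identical pixels in a
# full-width gradient; at 1920 px that run is 1920/256 = 7.5 px on average.
width, levels = 1920, 256
gradient = [x * levels // width for x in range(width)]  # quantized 8-bit ramp
band_widths = [gradient.count(v) for v in range(levels)]
print(min(band_widths), max(band_widths))  # bands alternate between 7 and 8 px
```

Since 1920 is not an integer multiple of 256, the bands alternate between 7 and 8 pixels wide, which is exactly the "roughly eight pixels" figure.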
I remember when I discovered banding was even discernible @ 8-bits, on a decent, properly-adjusted CRT computer monitor. I was pretty astonished, but I had made the test pattern myself and could clearly see many of step boundaries where they were supposed to be.
Video commonly uses a narrower range for luma/chroma: 16-235 for luma and 16-240 for chroma. Converting between this limited range and full-range 0-255 RGB can exacerbate banding.
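A quick sketch of why the conversion hurts: the limited luma range has only 220 steps, so a full-to-limited-to-full round trip can no longer represent all 256 input levels. (The scale-and-round conversion below is an assumed straightforward mapping, not any particular codec's implementation.)

```python
# Full-range (0-255) luma to limited-range (16-235) and back.
def full_to_limited(v):
    return round(16 + v * 219 / 255)

def limited_to_full(v):
    return round((v - 16) * 255 / 219)

round_trip = [limited_to_full(full_to_limited(v)) for v in range(256)]
print(len(set(round_trip)))  # only 220 distinct levels survive the round trip
```

The 36 lost levels mean some adjacent codes collapse together while others end up further apart, which widens and unevens the bands in a gradient.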
BTW, if we used a linear gamma (i.e., encoded luminance linearly), banding (and other noise) in dark areas of the image would be much more noticeable than it is: gamma encoding spends proportionally more code values on the dark end, where the eye is most sensitive to relative steps.
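To put rough numbers on that, here is a sketch comparing the size of one 8-bit quantization step at a dark luminance of 1% of white, for linear encoding versus a simple gamma-2.2 power law (an assumed simplification; the real sRGB/BT.709 curves have a linear toe segment):

```python
gamma = 2.2
L = 0.01  # dark luminance, as a fraction of white

# Linear encoding: codes are evenly spaced in luminance, so one step
# is always 1/255 of full scale, regardless of how dark the region is.
linear_step = 1 / 255

# Gamma encoding: find the code nearest this luminance, then measure the
# luminance gap to the next code after decoding.
code = round(255 * L ** (1 / gamma))
gamma_step = ((code + 1) / 255) ** gamma - (code / 255) ** gamma

print(f"linear step: {linear_step / L:.0%} of the local luminance")
print(f"gamma step:  {gamma_step / L:.0%} of the local luminance")
```

With linear encoding the step is roughly 39% of the local luminance at that level, versus about 7% with the gamma curve, so a dark gradient would band far more visibly if we stored luminance linearly at 8 bits.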