We did discuss this and ultimately felt that the GeForce SDR/DDR wasn't as revolutionary in its day as some of the others. Yes, it paved the way, but then so did the Riva TNT and TNT2 — there would be no GeForce if not for the original TNT card in 1998.
If we go by Tom's testing back in 1999, looking at the 1024x768 results as more representative of what people were actually using (I did buy a 1600x1200 monitor back then, but it wasn't great for gaming until faster GPUs arrived in the early 2000s), the GeForce 256 SDR was only 15% faster than the TNT2 Ultra. It was a bigger jump of 38% over the vanilla TNT2, though, and the GeForce 256 DDR was 32% faster than the TNT2 Ultra.
The real issue was that, back in the day, the GeForce's headline features, particularly hardware transform and lighting (T&L), weren't really utilized by games until two or three generations later. So at launch it mostly delivered a modest performance bump while paving the way for the future.
Today, more than 24 years later, the original GeForce cards gave the brand its name, but they're far enough in the past that I don't think most people pay much attention to them. If we were to do a list of the best Intel CPUs of all time, would we need to include the 8086 just because it was first? Some would say yes, unequivocally, and others would be just as adamant in saying no. 🤷‍♂️