This is a sweet concept! It's going to be fun, like a science or statistics experiment. I think taking prices at a yearly level probably wouldn't be enough; we should take them quarterly. It might even be interesting to go monthly, but that probably wouldn't be worth it, since the graph would end up cluttered and confusing to end users.

This won't just be based on statistics and science, but on economics as well. What we'll be working out is a theory of market value depreciation for the graphics card market. Using statistics, we'll find that not every product depreciates at the same rate; it depends on certain factors. A scatter plot or a bell curve can help us get around these problems, because computing a "market depreciation constant" from a simple average over the sum of all graphics products would produce disastrous results, mainly due to the wide variety of product types. It would be helpful to evaluate this from several views. The first way is to divide the products into categories. Even these will be relative and subject to inconsistencies, but far less so than if you put all products in the same basket.
We could evaluate in terms of the following:
1. Products of an individual company (ATi)
2. Products of an individual company (Nvidia or others)
3. High-end cards (upon official first release)
4. Mid-range cards (upon official first release)
5. Low-end cards (upon official first release)
6. DirectX support
7. Memory amount
8. Memory type
9. Other suggestions you might have.
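Here's a minimal sketch of how a per-category depreciation rate could be computed from quarterly prices. All of the card categories and price numbers below are made-up placeholders, and the log-linear fit is just one reasonable way to estimate an average quarter-over-quarter rate; it is an assumption, not a settled method:

```python
import math

# Hypothetical quarterly prices (USD) for a few card categories.
# Every number here is invented purely for illustration.
quarterly_prices = {
    "high-end":  [499.0, 449.0, 399.0, 349.0],
    "mid-range": [249.0, 219.0, 189.0, 169.0],
    "low-end":   [99.0, 89.0, 85.0, 79.0],
}

def quarterly_depreciation_rate(prices):
    """Fit price ~ p0 * r**t by least squares on log(price).

    Returns r, the average fraction of value a product keeps
    each quarter (e.g. 0.90 means it loses ~10% per quarter).
    """
    n = len(prices)
    ts = list(range(n))
    logs = [math.log(p) for p in prices]
    t_mean = sum(ts) / n
    l_mean = sum(logs) / n
    # Ordinary least-squares slope of log(price) versus quarter index.
    slope = sum((t - t_mean) * (l - l_mean) for t, l in zip(ts, logs)) / \
            sum((t - t_mean) ** 2 for t in ts)
    return math.exp(slope)

for category, prices in quarterly_prices.items():
    r = quarterly_depreciation_rate(prices)
    print(f"{category}: keeps about {r:.1%} of its value per quarter")
```

Fitting each category separately is exactly the point made above: one rate per basket, instead of one misleading constant averaged over the whole market.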
This is gonna be cool, Flinx!
My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!