News Star Wars Jedi: Survivor Requires a Staggering 155GB of Drive Space


JTWrenn

Distinguished
Aug 5, 2008
What new codecs? H.266 has very little hardware decoding support, so there's no point, and audio doesn't really need anything new when AAC/Opus already fit dozens of hours per gigabyte.
That is my point. We have stopped working hard on image compression for game tech, and thus we end up with these massive, ridiculous install sizes. We need to get back to working on all fronts to make software good, not to mention getting it out the door in reasonable shape instead of shipping extremely broken software and fixing it for a year after release. For the sizes we are talking about, the problem is now mostly the textures, and that is not going to get help from H.266. We need new texture compression tech that allows these 4K or 8K textures to not blow up our SSDs.
 
JTWrenn said: "That is my point. We have stopped working hard on image compression for game tech and thus we end up with these massive ridiculous things. ..."
There is... sort of: ASTC, developed by ARM and AMD. Anyway, OpenGL and Vulkan support it; Microsoft is behind, so...
 

JTWrenn

Distinguished
Aug 5, 2008
"there is...sort of... ASTC developed by ARM and AMD..."
It's just something that has fallen down the list of priorities for tech. It needs to get bumped back up. Also there are too many proprietary systems doing it rather than open source standards. That is the real issue. Companies have become much less collaborative...or at least are going through a cycle of proprietary software and hardware again.
 
JTWrenn said: "It's just something that has fallen down the list of priorities for tech. ..."
Well, I don't think you can compress it much more without losing quality. Assets are double-compressed at the moment: first texture block compression, then again in the pack files (zip compression, for example).
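The two-stage point can be illustrated with a small sketch, with zlib standing in for the pack-file zip stage. Near-random bytes approximate already-block-compressed texture data (which is close to incompressible), while a highly repetitive buffer approximates raw, redundant texture data; the buffer contents here are arbitrary stand-ins, not real asset data.

```python
import os
import zlib

# Stage 2 ("pack" compression) applied to two stand-ins:
# near-random bytes ~ stage-1 block-compressed texture data,
# repetitive bytes ~ raw, redundant texture data.
block_compressed_like = os.urandom(64 * 1024)
raw_like = bytes([0x80, 0x40, 0x20, 0x10]) * (16 * 1024)

packed_a = zlib.compress(block_compressed_like, 9)
packed_b = zlib.compress(raw_like, 9)

# The already-compressed-like data barely shrinks (ratio near 1.0),
# while the raw-like data collapses to a tiny fraction of its size.
print(f"already-compressed-like: {len(packed_a) / len(block_compressed_like):.2f}x")
print(f"raw-like:                {len(packed_b) / len(raw_like):.4f}x")
```

That is why the zip pass mostly earns its keep on the redundant, uncompressed parts of a pack, and adds little on top of texture data that stage 1 already compressed.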
 
JTWrenn said: "That is my point. We have stopped working hard on image compression for game tech and thus we end up with these massive ridiculous things. ..."
Texture compression has different requirements than, say, the image format your phone's camera uses for photos you view later. In particular it has to be fast, it has to be predictable, and it has to support random access.

The speed requirement is obvious. The compression technique can't spend a lot of time doing its work. Usually GPUs will actually implement the decompression algorithm in hardware.

Predictability is necessary because the rendering pipeline can't use a compressed texture directly to determine a color; it has to decompress the texture first. So not only does the decompression time need to be predictable, but the compression ratio also needs to be predictable so you know how much memory pressure you'll be putting on the GPU. Basically, if you give the codec an image, it will compress to the same size regardless of content, unlike, say, JPEG or PNG, which compress to different sizes depending on the content.

Lastly, random access is important because it allows the GPU to only decompress what's necessary. Without random access support, the entire texture would have to be decompressed because it's very hard if not impossible to determine where in the image you could actually start decompressing.
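The predictability and random-access points can be sketched with the published rates of common fixed-rate block formats (BC1 stores every 4x4 texel block in 8 bytes; BC7 and ASTC 4x4 use 16 bytes per block). Because the rate is fixed, both the total size and the byte offset of any block are pure arithmetic; the helper names below are just for illustration:

```python
def compressed_size(width, height, bytes_per_block, block_dim=4):
    # Fixed-rate formats: size depends only on dimensions, never on content.
    blocks_x = (width + block_dim - 1) // block_dim
    blocks_y = (height + block_dim - 1) // block_dim
    return blocks_x * blocks_y * bytes_per_block

def block_offset(x, y, width, bytes_per_block, block_dim=4):
    # Byte offset of the block containing texel (x, y), assuming row-major
    # block layout -- constant-time random access, no scanning required.
    blocks_per_row = (width + block_dim - 1) // block_dim
    return (y // block_dim * blocks_per_row + x // block_dim) * bytes_per_block

# A 4096x4096 RGBA8 texture is 64 MiB uncompressed; the fixed-rate
# compressed sizes are known before compressing a single pixel.
uncompressed = 4096 * 4096 * 4                # 64 MiB
bc1_size = compressed_size(4096, 4096, 8)     # 8 MiB (0.5 bytes/texel)
bc7_size = compressed_size(4096, 4096, 16)    # 16 MiB (1 byte/texel)
print(uncompressed // 2**20, bc1_size // 2**20, bc7_size // 2**20)  # 64 8 16
```

A variable-rate format like PNG can't offer either guarantee: you only learn the size after compressing, and you can't jump to an arbitrary texel without decoding everything before it.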

I'm sure there are plenty of people working on better compression algorithms, but the requirements for textures don't make it easy.
 

tamalero

Distinguished
Oct 25, 2006
They claimed at the time that using something like MP3 would be too costly in terms of compute power.

WAV for sound effects I can understand; they're short and they should be ready to go at a moment's notice. But something like, I dunno, cutscene dialog? Nah, I don't buy their argument. What makes it even less convincing is that MP3 was used in Nintendo 64 games like Perfect Dark (it has a license acknowledgement for it on the boot screen).
How the hell would MP3 be too costly in compute power? For WHO?