Just got back into playing Minecraft on a server with friends after upgrading some hardware. I get anywhere from 100-200 FPS without shaders, but as soon as I enable one, FPS drops to somewhere between 30 and 80 with a lot of fluctuation. Over the last week or so I have tried:

- Installing OptiFine
- Reinstalling Java
- Allocating various amounts of RAM, from 2 GB to 8 GB (via the launcher's JVM arguments; example below)
- Reducing my render distance
- Toggling V-Sync and the frame limit
- Disabling some AMD software that was rumored to cause stuttering and FPS drops
- Taking Discord off its setting to use dedicated graphics
- Every combination of shader settings, plus a few other things

I have googled this topic to death and have found nothing.
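For reference, the RAM allocation was done by editing the JVM arguments in the launcher profile; -Xmx sets the maximum heap size. An example of the kind of argument I changed, here with 8 GB as the cap (the exact value is what I varied between 2G and 8G):

    -Xmx8G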
Specs:
OS: Windows 10 Home Edition
CPU: Ryzen 5 2600 @3.4GHz
GPU: Radeon RX 580 4GB (According to MSI Afterburner: Core Clock: 1350 MHz, Memory Clock: 1750 MHz)
RAM: 16GB 2666MHz (Considering overclocking)
HDD: WD 500GB, Toshiba 750GB
SSD: None
I play on the server with someone who has a Ryzen 5 2600X and a GTX 970 and easily gets >100 FPS with shaders applied. It's frustrating to put a decent amount of money into a system and have it perform like this; it should be able to handle shaders. Any help and/or insight is much appreciated. Thanks!
(So far, Minecraft is the only game where I've had low FPS, if that helps narrow things down.)