VGA doesn't have a standardized way for the TV to tell the video card what resolutions and refresh rates it supports, so video cards tend to be very conservative about the modes they offer when you connect via VGA. This is because back in the CRT days, you could destroy a monitor by trying to drive it faster or at a higher resolution than it was capable of. That can't happen with LCD monitors, but the overcautious attitude persists.
You can manually hack the settings, but I don't recommend this as it's extraordinarily difficult to get exactly right. Monitors with VGA input frequently came with "drivers" that you'd install on the computer. The "drivers" were just a list of supported resolutions, refresh rates, and clock timings so the video card would know what the monitor was capable of. But if you really want to try, AMD lets you do it.
https://www.amd.com/en/support/kb/faq/dh-032
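If you're curious what those "driver" files boiled down to, here's a minimal Python sketch of the idea — a mode table the video card could consult before picking an output. The timing numbers are standard VESA/CEA figures used purely for illustration; this is not AMD's or any monitor vendor's actual file format.

```python
from dataclasses import dataclass

@dataclass
class DisplayMode:
    width: int              # horizontal resolution in pixels
    height: int             # vertical resolution in pixels
    refresh_hz: float       # vertical refresh rate
    pixel_clock_mhz: float  # dot clock the monitor expects

# Illustrative mode table -- the kind of list those "drivers" amounted to.
SUPPORTED_MODES = [
    DisplayMode(1920, 1080, 60.0, 148.5),
    DisplayMode(1280, 720, 60.0, 74.25),
    DisplayMode(1024, 768, 75.0, 78.75),
    DisplayMode(640, 480, 60.0, 25.175),
]

def best_mode(modes):
    """Pick the highest resolution, then the highest refresh rate."""
    return max(modes, key=lambda m: (m.width * m.height, m.refresh_hz))

m = best_mode(SUPPORTED_MODES)
print(f"Preferred mode: {m.width}x{m.height} @ {m.refresh_hz} Hz "
      f"(pixel clock {m.pixel_clock_mhz} MHz)")
```

Over HDMI the TV hands the video card an equivalent table automatically (the EDID data), which is why it usually just works without any of this.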
You are usually much better off using HDMI.
Also tried via HDMI, but when connected it shows the image chopped off; the image is bigger than the monitor.
That's called overscan. It too is a relic of the past which persists today.
https://www.howtogeek.com/252193/hdtv-overscan-what-it-is-and-why-you-should-probably-turn-it-off/
When hooking up a TV to a computer, you need to disable overscan. Unfortunately, every TV manufacturer has a different name for it and buries the setting in a different place. All the info I could find on your TV model was in Spanish, so I couldn't pin down a definitive way to fix it on your set. You can try these generic JVC instructions I found, although some other info suggests that not all JVC TVs have a way to disable overscan.
https://www.techwalla.com/articles/how-to-fix-the-resolution-on-a-jvc-tv
Your video card's settings should have a way to scale its output to cancel out overscan. But that's not the best fix; use it only as a last resort if the TV doesn't let you disable overscan.
https://www.amd.com/en/support/kb/faq/dh-022
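To see why the GPU-side workaround is a last resort, here's a rough back-of-the-envelope sketch. The 5% overscan figure is just an assumption for illustration; actual TVs vary.

```python
# Rough sketch: when the GPU compensates for overscan, it shrinks the whole
# desktop so the TV's crop only removes the added border. The desktop then
# no longer maps 1:1 onto the panel's pixels, which softens text and detail.

def underscan_size(width, height, overscan_pct):
    """Resolution the desktop is effectively rendered at when the GPU
    compensates for a TV that overscans by overscan_pct on each axis."""
    scale = 1.0 - overscan_pct / 100.0
    return round(width * scale), round(height * scale)

# A TV that overscans by ~5% crops a 1920x1080 signal to roughly the middle
# 1824x1026; compensating in the driver scales the desktop down to fit that
# area instead, so nothing is cropped but nothing is pixel-sharp either.
w, h = underscan_size(1920, 1080, 5.0)
print(f"1920x1080 desktop rendered into roughly {w}x{h} of the panel")
```

Turning overscan off on the TV avoids that rescaling entirely, which is why it's the preferred fix.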