ye olde gfx card q
category: gfx [glöplog]
Hi there,
As you might have noticed, NV and AMD have released new lines of gfx cards, and while they are impossible to get at the moment (unless you are willing to pay 100-200 bucks on top), they will become available at some point in the future.
So it looks like AMD is back in the game, with the RX 6800 (XT) going against the RTX 3070/3080.
Reviews so far state that NV still has the better RT performance in existing titles but less VRAM. NV also has nifty things like DLSS and the extra cores, while AMD *seems* to focus on rasterization power via a higher base clock.
Regarding VRAM, every test I read only managed to max out the 8/10 GB at really high resolutions like 4K and up, but I'm still on WQHD anyway and will probably keep my current screen for some time to come. Also, the 3070 is a bit cheaper.
So I'm really torn between a 3070 and the 6800 (without XT), but I remember that having an AMD card used to mean "well, only half of the demos actually work, but if they do, it's pretty OK".
While gaming performance is of course also interesting, I'm curious whether that has changed in the last couple of years, and whether you think the software and RT head start NV has at the moment will shrink once the next-gen consoles with AMD gfx rule the land...
Coming from a GTX 1060 I'm pretty happy with, I tend to lean towards NV again, but there might actually be good reasons not to, this time around...
nothing personal
your post is a mess
1. You can not call a GTX 1060 "old" (by technical specs etc.). From 2006 till today nothing fundamental has changed in GPUs; you can launch everything from today on 2006-era video cards (Nvidia 8600-8800 series), everything including RTX.
2. "Only half of the demos actually work" - on which OS, Win/Linux? With which graphics API, DX9/10/11/12, OpenGL 1/2/3/4, or Vulkan?
On any of these graphics APIs, anyone can make a "single triangle app" that works only on AMD or only on Nvidia, because the "programmer" of such a demo does not follow the official specs and drops half of the "needed" commands to make his exe smaller. In most cases this is the reason why a demo works only on a single GPU vendor, or even only on a single GPU driver version...
AMD has stricter requirements for applications to follow the specs; this is the second reason why "broken code" does not work on AMD.
3. DLSS does not improve performance for homemade apps (which most demos are); only if you work at a very large AAA corporation can you maybe use DLSS in your application/game.
4. You do not need an RTX video card to develop/launch raytracing demos; people have been developing raytracing demos since the 2000s (and earlier) and making them run in real time on that old hardware...
So what I mean is: there is nothing to be happy about, actually. The 3XXX Nvidia is just another video card, and two years later a new series will come... If you are happy to throw money at the Nvidia corporation today, do as you wish.
My personal opinion about video cards: they are already outdated, all of them, and Intel and AMD understand this. Using anything more than a "modern integrated GPU" (like AMD Vega 8) is not worth it, and in 1-2 years those GPUs will also get "real-time raytracing" acceleration support...
so you are saying that getting hw-accelerated RT is useless anyway because some demos already implement it in software, and that integrated gfx should be enough for everybody?
Quote:
from 2006 till today nothing fundamental has changed in GPUs
No.
nothing changed indeed... float[] -> GPU -> pixels! i'd recommend getting a TNT2!
TNT2 Ultra with overclocked memories (before they fade :( )
Quote:
your post is a mess
I don't agree. The post reflected exactly what most people who plan on getting a new graphics card in the near future are thinking.
Quote:
1. you can not call gtx1060 "old"(by technical specs etc)
That's true, feature-wise a GTX 1060 is fine as long as you don't want or need hardware raytracing or DLSS. Performance-wise, though, the current crop of cards is easily twice as fast (if not more), so I can understand the desire to upgrade.
Quote:
2. "Only half of the demos actually work" - on which OS, Win/Linux? With which graphics API, DX9/10/11/12, OpenGL 1/2/3/4, or Vulkan?
OpenGL, mostly, and in most cases due to the way sloppy demo coders exploited the sloppy GLSL compilers nVidia used to have (which, for example, happily accepted implicit int-to-float conversions that the spec forbids). Sure, that's the programmer's fault, no doubt about that ... but if you just want to run the f*cking demo, you may well be out of luck with AMD (or Intel, for that matter). In many cases you could fix things on your side with shader replacement tools, but fixing minified shaders really isn't what you want to do when you actually just want to watch some 4k intro ...
Quote:
3. DLSS does not improve performance for homemade apps (which most demos are); only if you work at a very large AAA corporation can you maybe use DLSS in your application/game.
... or if you want to play said game. There are people who don't use their GPUs exclusively for demoscene purposes :)
Quote:
people have been developing raytracing demos since the 2000s (and earlier) and making them run in real time on that old hardware
Yes, raytracing (the technique) doesn't require RTX (the technology and its APIs). So what? If you want to run something that implements raytracing using those newfangled HW raytracing APIs, and only those, you need the HW support for it. It's nothing that you, the consumer, can control.
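To illustrate the distinction (a toy sketch of my own, not anyone's actual demo code): raytracing as a technique is just geometry you can run on any CPU, no RTX required. A minimal ray-sphere tracer in Python, with a made-up scene (unit sphere at z = -3, camera at the origin):

```python
# Minimal CPU ray-sphere tracer: raytracing-the-technique needs no RTX
# hardware, just math. Returns a grid of grayscale brightness values.
import math

def trace(width=8, height=8):
    rows = []
    for y in range(height):
        row = []
        for x in range(width):
            # Unit direction of the camera ray through pixel (x, y).
            dx = 2 * (x + 0.5) / width - 1
            dy = 1 - 2 * (y + 0.5) / height
            dz = -1.0
            inv = 1 / math.sqrt(dx * dx + dy * dy + dz * dz)
            dx, dy, dz = dx * inv, dy * inv, dz * inv
            # Ray origin (0,0,0), sphere center c = (0,0,-3), radius 1:
            # |t*d - c|^2 = 1  =>  t^2 - 2t(d.c) + |c|^2 - 1 = 0
            b = dz * -3.0                  # d . c
            disc = b * b - (9.0 - 1.0)     # b^2 - (|c|^2 - r^2)
            if disc < 0:
                row.append(0.0)            # miss: black background
                continue
            t = b - math.sqrt(disc)        # nearest intersection distance
            hx, hy, hz = dx * t, dy * t, dz * t
            # Surface normal = hit point - center (already unit: r = 1).
            nx, ny, nz = hx, hy, hz + 3.0
            # Shade by how directly the surface faces the camera.
            row.append(max(0.0, -(nx * dx + ny * dy + nz * dz)))
        rows.append(row)
    return rows

img = trace()
```

Run it per-frame over enough pixels and you have the core loop every software raytracing demo since the 2000s is built on; what RTX adds is hardware acceleration for the intersection tests, behind vendor/API-specific entry points.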
Quick subjective opinion based on durability and low fps: Nvidia = a fast German tank you can use for 2 years or longer, AMD = a solid Russian tank you can use for 5 years or longer.
They should have called it VI Tiger RTX instead of Titan RTX!