Comments by "Cestarian Inhabitant" (@cestarianinhabitant5898) on "JayzTwoCents"
channel.
-
I'm mildly impressed. It's late, but it's on par with the 2080 and the 1080 Ti. It doesn't have anything that directly competes with RTX (except that AMD's cores have always been better suited to compute work than Nvidia's, which may be enough to compensate, in theory), but it makes up for that by pitting 16GB of HBM2 against 8GB of GDDR6. Where the 2080 will barely have enough VRAM to run everything, the Radeon VII has so much of the stuff that it has far more potential longevity: as future games use more and more VRAM, the 2080 will eventually fail to keep up, but the Radeon VII will not. It'll also be useful for CAD work.
And another beauty of it is that Linux users will flock to buy this. AMD's Linux driver is better than Nvidia's (for a very long time Nvidia's was much better on Linux, but that changed when AMD released an open source driver called amdgpu; it wasn't up to spec when it was new, but over the years it has caught up). There are many reasons for Linux users to prefer AMD for this alone, chief among them convenience. Put a (supported) AMD card in any computer and install Linux on it, and it will be ready for gaming as soon as you boot. With Nvidia, it will (at least for recent cards) fall back to a generic driver and require you to install Nvidia's driver manually, which on some distributions is a pain in the neck, especially for the less savvy.
Nvidia has put some effort into their Linux driver, and all things considered it's actually quite good, but they're dicks about certain things. The GUI control panel, for example, looks like it belongs in the 90s, and a lot of the technologies they use for marketing aren't supported: if a feature requires you to turn on an in-game setting, it probably won't work (so no DLSS, for example; G-Sync support was only added just the other day, finally, after not working for a long time). They also occasionally decide to develop their own library implementations for things and refuse to support the alternatives. Wayland, a very important project that many hope will replace X11, is generally implemented on top of GBM, an API for buffer allocation; Nvidia developed something they call EGLStreams instead and refuse to support GBM. Nobody really wants to use Nvidia's approach when the other thing was already in place and there's no demonstrable reason EGLStreams would be any better. But we all know how Nvidia likes to overcommit on random stuff they develop themselves, like G-Sync, even when the competing tech, like FreeSync, is more sensible.
Meanwhile AMD is mostly free of all that bullshit, and since their driver is open source, if support for something is missing, anyone can just add it themselves if they have the desire and skill to do so.
Speaking of Linux anyway, Linux gaming became a lot more viable in 2018. You can probably expect about 80% of your Steam library to run without much trouble on it now thanks to Steam Play/Proton, and most of those games (especially DX10/11 titles) will run at 80-90% of native performance too.
The card's performance on Linux lines up with what's shown here as well (AMD's open source driver vs Nvidia's closed one): https://www.phoronix.com/scan.php?page=article&item=radeon-vii-linux&num=1. The mean result of all the benchmarks combined puts the Radeon VII 12% faster on average than the 2080 on Linux (sometimes it wins by a large margin; when it loses, it's usually not by much). Its compute performance with properly optimized usage also looks potentially above the 2080 Ti's (I have no idea whether the RTX cores were being used here, but that's semi-irrelevant: if they weren't, it's because nobody supports them yet).
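The "mean result of all the benchmarks combined" that Phoronix reports is a geometric mean of per-benchmark performance ratios. A minimal sketch of that calculation in Python, with made-up ratios for illustration (the real numbers are in the linked article):

```python
import math

# Hypothetical per-benchmark performance ratios (Radeon VII / RTX 2080);
# the real values are in the linked Phoronix article.
ratios = [1.25, 0.95, 1.10, 1.30, 1.02]

# Geometric mean: the n-th root of the product of the ratios, which keeps
# one outlier benchmark from dominating the overall result.
geomean = math.prod(ratios) ** (1 / len(ratios))
print(f"Radeon VII is {100 * (geomean - 1):.0f}% faster overall")
```

The geometric mean is the right average for ratios: a benchmark where one card wins 2x and another where it loses 2x cancel out exactly, which an arithmetic mean would not do.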
-
Top 5, alright, in no particular order, and excluding browsers and the basic shit like Steam and Discord...
1: K-Lite Codec Pack Standard (configured to use madVR, which I then tweak a little extra to get smooth motion and that nice-ass upscaling algorithm)
2: qBittorrent (like uTorrent, but not crap)
3: 360 Total Security (configured to use all the available antivirus engines); it's probably the best free antivirus software. I always used Avira before this, which is probably still the second best.
4: MSI Afterburner; I use it both to OC my GPU and to monitor my hardware in-game with the OSD. Quite useful.
5: AutoHotkey. Besides its various uses in games (for example, remapping the arrow keys to WASD in games that don't let you change keybinds, like RPG Maker titles and various older games), I also use it to bind desktop switching the same way I have it on Linux, e.g. Ctrl+F1 through F6 switches to desktops 1-6, which is a fairly vital function for me to avoid cluttered windows all over the place...
Linux edition:
1: mpv (the tricky part here is finding the right configuration to get the best image quality, enable motion interpolation and whatnot, but the result is more or less equal to or better than K-Lite)
2: qBittorrent ( :D )
3: Conky (I set it up to monitor my CPU/GPU temps from the desktop, plus CPU usage, memory usage, which applications are using the most CPU/memory, and my active network usage; it's real freaking handy, but configuring it just right is a bit of work)
4: Lutris (wanna play Windows-exclusive games? Use Steam and its Proton. Game not on Steam? Use Lutris)
5: Audacious (there are a lot of music players on Linux, but I prefer this one over all of them because it's not a music library app like iTunes or whatever; it's functionally closer to Winamp or Windows Media Player. No extra complicated bullshit on top, just a music player with some basic tools like an equalizer)
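The mpv setup mentioned in item 1 can be sketched as a few lines of mpv.conf. These are real mpv options, but treat the combination as a starting point under the assumption of a reasonably capable GPU, not a known-good config:

```
# ~/.config/mpv/mpv.conf -- a minimal sketch for motion interpolation
# and higher-quality upscaling; tune per your GPU and mpv version.
profile=gpu-hq              # bundle of higher-quality scaling/dithering defaults
scale=ewa_lanczossharp      # a high-quality upscaler for the luma plane
video-sync=display-resync   # sync video to display refresh (needed for interpolation)
interpolation=yes           # mpv's equivalent of madVR-style smooth motion
tscale=oversample           # temporal scaler used for the interpolation
```

The key pairing is `video-sync=display-resync` plus `interpolation=yes`; interpolation silently does nothing without a display-synced video clock.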
-
This is a good thing. Intel did not do the smart thing and lower their prices; they did the stupid thing and went for a quick cash grab. I hope this hurts them so AMD can thrive a little for a while.
I want to see the companies compete, seriously compete, especially on pricing. But right now, with Intel's resources, if they aggressively lowered their prices to compete with AMD, Intel could probably crush AMD outright due to the sheer difference in available cash.
I've had issues with Intel for years now because of their absurd pricing. I still used their products, of course; there was no other choice. But now that there is, I'm gonna switch over to AMD at the first chance I get. Then again, I was gonna do the same for AMD's RX line of graphics cards, since they had comparable price-to-performance to Nvidia's competition, but then Nvidia went and chopped the prices of the 900 series in half, so I got myself a 980 Ti instead (for just a bit more than an RX 480 or GTX 1060 would have cost!). That was a deal too good to refuse.
Truth is, I want both companies to succeed so they can keep competing; Intel going all suicidal for a while is a good thing because it levels the playing field between Intel and AMD a little.
-
Damn, overclocking is pretty fun: taking an old processor and squeezing it to the performance of new processors, feelsgoodman. My new 4790K is outperforming stock 6700s by a noticeable margin in benchmarks. I just tried to get the highest performance I could within the 1.3V safety limit I hear 22nm processors tend to have. (I could only get it to 4.76GHz stable though, with a 4.14GHz base clock; I almost got a 4.44GHz base clock stable, almost. But I'm pretty happy: 947 multi-threaded and 196 single-threaded in Cinebench, definitely worth it. I started out at about 805/170.) It's my first overclock, though; I think that's not bad.
Cache and RAM are a pain in the neck to tweak though, oh my fucking god.
The hardest part was finding the right air cooler that fit in my case.
Although I'm not sure what safe operating temps are; my idle temps are pretty high (~50°C), but my load temps hover around 75-83°C.
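For what it's worth, the Cinebench numbers quoted above work out to a healthy uplift. A quick sanity check of the arithmetic:

```python
def pct_gain(before: float, after: float) -> float:
    """Percentage improvement from 'before' to 'after'."""
    return (after - before) / before * 100

# Cinebench scores from the comment: stock -> overclocked
multi = pct_gain(805, 947)    # multi-threaded score
single = pct_gain(170, 196)   # single-threaded score
print(f"multi: +{multi:.1f}%, single: +{single:.1f}%")  # roughly +17.6% / +15.3%
```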
-
I bet a lot of handyman type guys got very triggered watching this.
Also Jay, it's OK if you don't know how to make things look pretty. Here's what you do: go minimalist, then buy one potted plant (a fake plant, you know, so you don't have to water it), put it somewhere, and you're good; it looks great. The only other rule is that everything must stick to the same one or two (max three) colors, except the potted plant, which can only be green.
-
I believe the problem with RTX, and why more developers don't implement it, is that it's Nvidia-exclusive tech. That means extra work done only for Nvidia card owners; AMD customers won't get to enjoy it even if their cards are powerful enough. Perhaps most damningly, it can't work on any platform other than PC, and a lot of games get most of their players from consoles, making it a wasted effort.
Additionally, further down the line people might even end up gaming on integrated graphics (AMD's integrated Vega series was a pretty huge deal, since it already lets you play a lot of games decently well, and I imagine this will only improve over time), and there won't be any forward compatibility with any such advances because of the Nvidia exclusivity.
RTX is great technology. However, making the software/engine end of things Nvidia-exclusive as well is a huge fucking mistake; what's more, it makes them look like complete jackasses, especially since AMD has a strong tendency to do the exact opposite: make all software solutions open (often going as far as open-sourcing them) and not directly dependent on their own hardware (i.e. they work on Nvidia cards too).
As for DLSS: it's also great technology, but severely limited. It's a bit of a pain in the ass to implement, more of a pain to maintain that implementation, and potentially incompatible with modding (things like texture or model replacers in particular, or new content added via mods). DLSS requires a deep-learning model to be trained to do the (honestly amazing) upscaling on a per-game basis, and it needs to be retrained if any textures or models change or new content gets added, which is why I call it limited.
I understand why they like to advertise their cards at their best with DLSS enabled, but it's honestly misleading. The 3090 is not really an 8K-capable card as they claim; it's only truly 4K-capable, at least from the gamer's point of view, unless the game supports DLSS 2. Great thing to have when you have it, but most of the time you probably won't.
These are no doubt impressive cards, but the way they handle RTX exclusivity makes them look like morons, and the way they rely on DLSS to advertise the cards as more capable than they actually are makes them actual assholes. The only saving grace is the saner pricing strategy and the elevated base performance. The Nvidia-exclusive tech is still, as always, a letdown in some way or another.