Comments by "Winnetou17" (@Winnetou17) on "The Friday Checkout" channel.
-
It's glorified RDP hardware.
And 2 x 4K doesn't need THAT much. You can get good 1080p quality at 10 Mbit/s. 4K has 4 times the pixels of 1080p, so AT MOST it would need 40 Mbit/s. I say at most because the video streaming will (of course) be done with compression (probably AV1), and if the video isn't very "busy" with small details, the extra pixels won't add much once compressed.
So 2 x 4K would be 80 Mbit/s. Let's round it to 100 Mbit/s and have them at 60 FPS. And no, doubling the framerate doesn't double the size, again because of compression.
Of course, more is better. But I wouldn't be surprised if it were actually totally fine with 100 Mbit/s, because normal applications have a lot of areas that are basic and simple. Margins on a website or in a Word document compress very easily. The background color of Excel cells is easy to compress too, and it barely moves, so you might see artifacts only when you scroll, while everything looks crystal clear when you're just typing and 95% of the screen is static. The OS taskbar and the menu/top of the app(s) are also very static, so in practice it's as if you had fewer pixels.
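To sanity-check the arithmetic above, here's a minimal Python sketch. The 10 Mbit/s 1080p baseline and the linear pixel-count scaling are the comment's own worst-case assumptions, not measured figures; real codecs like AV1 scale much better on static content:

```python
# Rough upper-bound estimate for the 2x4K streaming claim above.
# Assumes bitrate scales linearly with pixel count (a worst case;
# compression makes the real number lower on mostly static screens).

BASE_1080P_MBPS = 10           # "good 1080p quality", per the comment
PIXELS_1080P = 1920 * 1080
PIXELS_4K = 3840 * 2160

scale = PIXELS_4K / PIXELS_1080P            # = 4.0
per_4k_stream = BASE_1080P_MBPS * scale     # <= 40 Mbit/s per stream
two_streams = 2 * per_4k_stream             # <= 80 Mbit/s total

print(f"One 4K stream:  <= {per_4k_stream:.0f} Mbit/s")
print(f"Two 4K streams: <= {two_streams:.0f} Mbit/s "
      f"(round up to ~100 for 60 FPS headroom)")
```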
-
Lunar Lake is only half of the next gen, meant for very efficient ultrabooks. The other half is Arrow Lake.
While I normally don't like soldered memory (or soldered anything), I'm very excited about Lunar Lake. It should be much more efficient, and it should FINALLY kill the gigantic horde of "bUt aRm iS sO mUcH mOrE eFfiCieNt, die x86 hurr durr".
With proper controls, drivers, and software (like Linux), it should finally be on par with Apple's M-powered laptops in terms of efficiency: latest manufacturing node, memory on package, and no massively bloated OS + apps like Windows. That is, it won't be dragged down by non-ISA factors that make x86 seem less efficient than ARM. Well, I do think x86 is less efficient, but to a much smaller degree, like 5-10%, nothing massive and certainly not something to ditch x86 over. Hopefully with Lunar Lake we'll see the real difference.
-
@protocetid Heat and energy efficiency go hand in hand. If one varies differently from the other, that's not an x86-vs-ARM thing; it's down to that particular implementation, the manufacturing node used, the IHS, things like that.
The software freedoms... well, that's still not something exclusive to x86 or against ARM, but more about Android, iOS, and Google services. Pinephones exist, and you can install everything there. Another (non-phone) ARM platform that is very free is the Raspberry Pi. So it's not ARM's fault that today's smartphones are so incredibly closed and full of spyware. It's mainly Google + Apple + Samsung, plus the rest of the manufacturers.
And, yeah, it's true: nowadays performance is so good that normal apps, especially without so much spyware on the device, can easily be handled by most CPUs, even very low-end ones.
-
@protocetid Actually, TDP is not how much a chip consumes, though it's not that far off either. When you don't have anything better, and you don't need/expect good precision, it can be used as a stand-in.
The thing is, TDP is how much HEAT a cooler for that chip should be able to dissipate. If you think about it, that and how much the chip consumes should never be exactly the same, unless the chip is literally a resistor. As an example, AMD's top Zen 4 desktop chips have a 170W TDP, but they can consume (without overclocking) 230W under sustained load. Fortunately, this is one of the bigger deviations; usually the TDP is closer to the actual consumption. Well, PEAK sustained consumption! The chip, when not under full load, will always consume (much) less. And in short bursts, it can consume (much) more.
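For concreteness, here's the Zen 4 deviation above in numbers (170 W and 230 W are the stock TDP and sustained package power figures cited in the comment):

```python
# TDP understates peak sustained draw on top Zen 4 desktop parts.
tdp_w = 170        # advertised TDP
sustained_w = 230  # stock sustained package power under full load

overshoot = (sustained_w - tdp_w) / tdp_w
print(f"Sustained draw exceeds TDP by {overshoot:.0%}")  # ~35%
```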
Getting back: the Steam Deck is 15W TDP, but it's optimized to run at more like 3-9W. In a game like Dead Cells (a 2D indie game, pretty lightweight, but still far from idle-level power draw), the OLED Steam Deck can run for over 8 hours. With a 50 Wh battery, that means that, on average, the chip + screen + WiFi + audio (I think without Bluetooth) consumes about 6W. Which makes me think the chip itself is consuming something like 3-4 W.
Still, given that the Steam Deck has only 4c/8t, that's not exactly high end. Current phone flagships are certainly both more performant and more efficient. Not sure how it compares on GPU performance. A typical phone battery nowadays is 5000 mAh, which, given that Li-ion batteries usually hover around 4V (between 3.7 and 4.2), makes for a battery capacity of approx. 20 Wh. Less than half of what the Steam Deck has.
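A quick sketch of the battery math from the last two paragraphs; all inputs are the rough figures quoted above, not measurements:

```python
# Back-of-the-envelope power math for the Steam Deck vs phone comparison.

# Steam Deck OLED: ~50 Wh battery, ~8 h runtime in a lightweight 2D game
deck_battery_wh = 50
deck_runtime_h = 8
deck_avg_draw_w = deck_battery_wh / deck_runtime_h   # ~6.25 W whole-system draw

# Typical phone: 5000 mAh at ~4 V nominal (Li-ion sits between 3.7 and 4.2 V)
phone_mah = 5000
phone_volts = 4.0
phone_battery_wh = phone_mah / 1000 * phone_volts    # ~20 Wh

print(f"Steam Deck avg system draw: ~{deck_avg_draw_w:.2f} W")
print(f"Phone battery capacity:     ~{phone_battery_wh:.0f} Wh "
      f"(vs {deck_battery_wh} Wh in the Deck)")
```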
So the Steam Deck's APU (which I still consider the closest thing in x86 space to what a phone or a very efficient tablet/ultrabook would need) is not as efficient as current smartphone chips. Though it is also built on 6nm, while the most recent phone chips are on 3nm, almost two generations newer, which is a pretty big difference.
So, overall, I think that on the hardware side, while it would most likely be a setback in terms of performance or (maybe even and) efficiency, both Intel and AMD could, if they wanted, come up with a smartphone chip that still has decent efficiency and performance, just not flagship level.
Now, on the software side, the advantage with Linux... that is, GNU/Linux phones (Android, technically, is also Linux) is also the control you get. And, I guess, a bit of compatibility with software made for the desktop. I wouldn't say there's big demand for that, unfortunately. Most likely just techies like us, and maybe privacy nerds.
Still, it's nice to see how far the Pinephone got, even though what they have seems a bit too low end. The chip itself can be very efficient; they don't have a lot of cores, and nothing is overclocked. It seems the firmware and drivers they use, or something like that, are still not up to the task. Or maybe everything is rendered on the CPU instead of the GPU, dunno. But the chip itself is a pretty common ARM part with 4 A53 cores, and those can totally be efficient.
Oh, and good point about Waydroid. I haven't checked it out, but from what I remember, you can already run a lot of apps through it. So you can get the best of both worlds with it.
-
@xiphoid2011 You're not wrong, but for many, the money they get actually helps them train better, because many live in countries with massive underfunding, where successful athletes are the exception rather than the norm. It's the harsh reality.
I know this is the case with my country, Romania: literally all the medals we got are the sole merit of the respective athletes and those very near them, and basically 0% the merit of the state/government or the mass media (which is like 30% football aka soccer news, 30% tennis news, 30% gossip, usually around the people involved in football/soccer, and at most 10% all other sports combined). And Romania is, overall, kind of mid-level in terms of wealth; there are certainly countries much worse off.
So, yeah, an extra $1000 can be a significant boost in income, helping with the stress of traveling, with adequate equipment and nutrition, with the people around the athlete also getting paid decently, and so on.