Comments by "Scott Franco" (@scottfranco1962) on the "GNOME Takes Big Step Towards Fractional Scaling" video.

  2.  @boembab9056  If you scale the screen, you are letting the OS/presentation system draw things bigger for you. If you do it in the application, the application is doing the "scaling", QED. So let's dive into that.

      In an ideal world, the presentation system takes all of your calls (lines, drawings, pictures) and scales them intelligently. In that same world, the Easter bunny is flying out of your butt. All the system can really do is interpolate pixels.

      Let's take a hypothetical for all of you hypothetical people. Say they come out with a 4M display, meaning 4 megapixels across, not the 4 kilopixels of a 4K panel. You scale 100%, meaning "no scaling". All of the stupid apps look like little dots on the screen because they are compressed to shit. Now we scale some 1000 times to get it all back. If the scaler does not consider what was drawn, just the pixels, it's going to look terrible when scaled, just as if you blow up a photo on screen far in excess of its resolution. The apps that are NOT stupid, that actually drew themselves correctly, are going to look fine, perhaps that much smoother because they took advantage of the extra resolution.

      Now let's go one more. I know this is boring; drink coffee, pay attention. Drawing characters at small point sizes is a problem, right? People worked out all kinds of systems, like "hints", to try to make fonts look good at small point sizes, say 5-8 points. But you bought that 4K monitor and that 4K card, and THEN you bought a fast CPU to push all of that data around. Guess what? That 5-point problem you had is gone. Just gone. There is sufficient resolution to display fonts on screen down to the point where you can barely see them. Now ask yourself: how does a scaling algorithm do that unless it DRAWS the characters at that resolution? Keep in mind that programmers spent decades on TrueType formats and computed character drawing to match mathematical curves to pixels. Is an interpolated scaler going to do that? No, no it is not (see the sketch below). Peace out.
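      A minimal sketch of that difference, assuming nothing beyond standard Python; the 5x5 glyph bitmap and the upscale_nearest helper are made up for illustration, not anything from the comment. A pixel-only scaler sees the bitmap, never the outlines that produced it, so every edge scales up as a staircase.

          # Nearest-neighbor upscaling: the crudest "interpolated scaler".
          # It works purely on pixels; it cannot redraw curves.
          def upscale_nearest(bitmap, factor):
              out = []
              for row in bitmap:
                  scaled_row = []
                  for px in row:
                      scaled_row.extend([px] * factor)  # repeat each pixel horizontally
                  for _ in range(factor):               # repeat each row vertically
                      out.append(list(scaled_row))
              return out

          glyph = [  # crude 5x5 "A": 1 = ink, 0 = background (illustrative only)
              [0, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 1, 1, 1, 0],
              [0, 1, 0, 1, 0],
              [0, 1, 0, 1, 0],
          ]

          for row in upscale_nearest(glyph, 4):
              print("".join("#" if px else "." for px in row))

      Every edge comes out as a 4-pixel staircase; an application that instead re-rendered the glyph from its TrueType outline at the target resolution would get smooth curves.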
  3.  @boembab9056  Look, I know you are a smart guy, but think about what you are saying. If the application knew how to take care of its own scaling, the OS would not need to do anything: no scaling at all. The typical flow is:

      1. If the application has never come up before (the default), it takes the measure of the screen, then presents itself according to a rule of thumb, say 1/4 the size of the screen.
      2. Size the fonts according to the on-screen DPI. That is, if you have 12-point type, choose an on-screen font accordingly. Points are 1/72 of an inch, so 12-point type is 1/6 of an inch (about 0.17 inch) in height ON SCREEN.
      3. Set other dimensions accordingly. I personally use the point size to dimension everything else on screen, and I have found that works well.
      4. If the application has executed previously, just use the last window size. That is a reasonable expectation for the user.

      Do that, and no scaling is required; the app knows what to do (a sketch of the arithmetic follows below). If you think about it, what scaling REALLY does is accommodate stupid applications that don't understand how to scale themselves properly. I follow all of the rules above in my applications. I'll readily admit that I had to do some work to get to 4K displays, mostly because I had used standard (and, it turns out, arbitrary) measures to size items in the app's display. Also, when moving to 4K, I implemented a standard pair of keys to let the user adjust the size of the app's display (Ctrl-+ and Ctrl--, the same as Chrome and most other apps).

      This is the correct solution. Rescaling all applications because SOME programmers don't know what they are doing is not the right solution, and indeed it actually punishes the applications that did the right thing, messing with their scaling instead of letting them do it themselves.
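      A minimal sketch of steps 1, 2, and 4 above, assuming plain Python; the function names and the 160-DPI example figure are illustrative assumptions, not anything stated in the comment.

          # Step 2: points are 1/72 inch, so the pixel height of a font
          # depends only on the physical DPI of the screen.
          def points_to_pixels(points, screen_dpi):
              return round(points * screen_dpi / 72)

          # Steps 1 and 4: the first run uses a rule of thumb (read here as
          # half of each dimension, i.e. 1/4 of the screen area); later runs
          # just restore the saved window size.
          def initial_window_size(screen_w, screen_h, saved_size=None):
              if saved_size is not None:
                  return saved_size
              return screen_w // 2, screen_h // 2

          # A 3840-pixel-wide panel that is 24 inches across is 160 DPI, so:
          print(points_to_pixels(12, 160))  # -> 27 pixels, no compositor scaling
          print(points_to_pixels(12, 96))   # -> 16 pixels on a classic 96-DPI screen

      Because the pixel counts fall out of physical units (points, inches, DPI), the app renders at native resolution on any display, which is the "no scaling required" claim above.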