Comments by "C S ~ \x5bDuke of Ramble\x5d" (@DUKE_of_RAMBLE) on "" video.

  1. I might be wrong, as I'm not in the know on the actual machinery capabilities/longevity for the various process nodes, but I do know that the turnover rate for a lot of the machines from the last decade is pretty high, due to the smaller and smaller manufacturing processes and new processing methods. With that said, 35nm would have come in around roughly 2010 (45nm was 2008), so I have a hard time believing that three-decade-old machines would be capable of that, since they'd have been made during the time of 600nm lol. Two decades old puts it in between the 130nm and 90nm nodes, which is still an absurdly long time ago in this context. That being said, you may be correct on your machinery dates, just incorrect on the node...
From what I turned up after a rudimentary search, a mid-April news article on a site I trust with hardware news, TechSpot, mentioned this (emphasis mine): "Approximately 420 billion rubles ($5 billion) will get invested in developing newer fabrication nodes and ramping up production. Russia aims to ramp up the local chip production using a 90 nm node by the end of the year. By 2030, they intend to manufacture chips using a 28nm process technology, something TSMC did in 2011." So just about 20yrs ago, like you said, is when 90nm was the latest node, just as I said. 😁🤘 (Oh, and that $5bn estimate by Russia is fucking hilarious... then again, they probably hope to mostly steal their way to that goal... or maybe that figure is only for R&D 🤷‍♂️ I'll just chalk it up to a "delusional Russia number" lol)
HOWEVER, that aside, this all isn't to say they would be unable to manufacture capable and "fast" chips on that 90nm node. The chips will just require more power (significantly more, in the case of something like a computer's CPU) in order to operate, but their actual PCB footprint isn't really going to increase; or it shouldn't have to, technically speaking at least. Although, when it comes to SRAM or other memory types, that WOULD require more PCB real estate, as each module wouldn't be able to store as much on account of the far less dense manufacturing node (rough numbers sketched out right below this comment)...
    10
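(A rough back-of-envelope sketch of that last density point. It assumes, purely for illustration, that circuit area scales with the square of the feature-size ratio; real SRAM cells don't shrink anywhere near that cleanly, so the node list and the numbers below are ballpark only.)

```python
# Ballpark sketch only: how much more silicon area the same SRAM capacity
# might need on an older node, assuming area scales with the square of the
# feature-size ratio. Real cell libraries don't scale this cleanly.

NODES_NM = {"90nm": 90, "28nm": 28, "7nm": 7}  # illustrative node list

def relative_area(old_nm: float, new_nm: float) -> float:
    """Approximate area penalty of the same circuit on old_nm vs new_nm."""
    return (old_nm / new_nm) ** 2

if __name__ == "__main__":
    for name, nm in NODES_NM.items():
        print(f"Same SRAM block on 90nm vs {name}: ~{relative_area(90, nm):.0f}x the area")
```

Even under that crude assumption, the same memory block needs roughly 10x the area on 90nm as on 28nm, which is the "more PCB real estate" point in the comment above.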
  2. 5
  3. 4
  4. 2
  5. 2
  6. 2
  7. 1
  8.  @dianapennepacker6854  [note: I'm also a layperson; I've just enjoyed computers long enough to have picked up rudimentary knowledge. However, this means there could be glaring errors here.] AMD only designs their hardware –actually, this is complicated, I'll touch on it later– and then "foundries" like TSMC, Samsung, or GlobalFoundries apply that design to a large silicon wafer (on AMD processors, this is where it was "diffused"; i.e. "Diffused in Taiwan" if TSMC). Dozens to hundreds get patterned out per wafer, depending on the size of the "die" (the entire CPU, GPU, or whatever designed logic is being made). But then, once they are... we'll say 'etched', to simplify the process... once the big wafer disc is etched (they look beautiful at that point; https://upload.wikimedia.org/wikipedia/commons/f/ff/ICC_2008_Poland_Silicon_Wafer_1_edit.png ), the dies get individually cut out, then tested quite extensively with today's capabilities, whereby it's determined what sort of defects they have – if any. Today's designs are modular enough that a defect doesn't always mean the die goes in the garbage. If, say, the defective portion is one of a CPU's multiple processor cores, they can "laser cut" parts of that circuit to disable it. So instead of being an 8-core CPU, it gets turned into a 6-core (they usually disable cores in pairs, for various reasons; there's a rough sketch of this binning step after this comment). Then all these dies get sent someplace else, one of multiple places AMD uses, although I think these days it's Malaysia (on the CPU, this is the "Made in" portion). This is where they get packaged and become the product we are familiar with seeing. [Note: actually, I might be wrong, and the assembly plant might get sent the entire, intact wafer. Heck, anything from the testing to the defect laser-cutting may all occur there 🤷‍♂️ heh]
As for the "this is complicated" part I mentioned... AMD, for a long, long time, played second fiddle to Intel. Then the fates shone on AMD at the end of the 90s, when their new CPU architecture outperformed Intel's (significantly, at times), and they maintained that lead until ~2005-06. Then Intel started to claw its lead back, but they had also started doing some shady (and later determined to be illegal) business deals with all the major computer manufacturers, so that they would not sell as many AMD-equipped systems, or would only put them in less-favorable computers people wouldn't really want (I only mention this because it genuinely hurt AMD's bottom line, which is directly relevant because...). Then in 2008, AMD spun off their chip foundry and manufacturing side, which became its own entity called GlobalFoundries. Through agreements, that's who AMD would use exclusively (or nearly so) for roughly the next decade. More recently, "GloFo" had trouble getting below 12nm, and AMD now uses TSMC almost exclusively. heh
There's less to say about Intel, as they do it all in-house. I don't know if AMD fabricated chips for others back when they still owned their fabs, but I think so... GloFo, of course, did take outside work from whoever needed it. Whereas Intel did not take any, always having enough in-house volume to fill their fabs. Although, after AMD took the performance crown back once again, I think Intel decided to open its doors to others... Not because their fabs were sitting idle (their own nearly decade-long 10nm woes meant they had TOO MUCH fabrication work of their own), but because they needed extra income. As for who makes the machines...
I'll just quote CNBC: “ASML is the only company making the $200 million machines needed to print every advanced microchip. In the southern Dutch town of Veldhoven, near the border with Belgium, sits the only factory capable of assembling a revolutionary machine that's relied upon by the world's biggest chipmakers.” Which is nuts. I don't know why that is, though, as I've never been compelled enough to find out... That article might actually explain it, if you felt so inclined; just Google that first sentence and it'll come up.
_[Apologies for any condescension this post may convey; I've always made it a point to over-explain anything public (like these comments), both because I don't know your level of understanding, but ALSO just in case anyone else reading who isn't in the know wants a chance of following along. 🥴]_
    1
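(To make the test-and-bin step in the comment above a bit more concrete, here is a minimal sketch. It is my own illustration, not AMD's or any foundry's actual flow; the per-core defect probability, the dies-per-wafer count, and the "at least 4 good cores or scrap" cutoff are all made-up assumptions.)

```python
# Minimal binning sketch: each core on a die independently has some chance of
# carrying a defect; dies with a few bad cores get cores disabled in pairs and
# are sold as lower-core-count parts instead of being scrapped. All numbers
# are made up for illustration.
import random

DIES_PER_WAFER = 200      # assumed dies per wafer
CORES_PER_DIE = 8
CORE_DEFECT_PROB = 0.05   # assumed per-core defect probability

def bin_die(bad_cores: int) -> str:
    """Decide which product a die becomes, disabling cores in pairs."""
    if bad_cores == 0:
        return f"{CORES_PER_DIE}-core"
    disabled = 2 * ((bad_cores + 1) // 2)   # round up to whole pairs
    good = CORES_PER_DIE - disabled
    return f"{good}-core" if good >= 4 else "scrap"

def simulate_wafer(seed: int = 42) -> dict:
    """Tally how one simulated wafer's dies get binned."""
    rng = random.Random(seed)
    bins = {}
    for _ in range(DIES_PER_WAFER):
        bad = sum(rng.random() < CORE_DEFECT_PROB for _ in range(CORES_PER_DIE))
        label = bin_die(bad)
        bins[label] = bins.get(label, 0) + 1
    return bins

if __name__ == "__main__":
    # Prints counts per product bin for one simulated wafer.
    print(simulate_wafer())
```

The only point it's meant to show is that, even with a modest defect rate, a meaningful share of a wafer's 8-core dies can still ship as 6-core parts rather than going in the garbage.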
  9. 1
  10.  @assertivekarma1909  Thanks. 😄 I do indeed consider myself open-minded and an outside-the-box thinker. And I totally agree, it's a shame we have created societies that only (or primarily) foster dominating others instead of cooperation. Where someone who creates a successful company will muscle out everyone else, or outright purchase them so as to remove them as competition... Even when that doesn't result in monopolies, I still find it will often lead to more bad than good. Chiefly, when these amalgams decide to sit on the tech they now own instead of making use of it (this has happened a few times over the decades just in consumer computer hardware!), sometimes out of corporate spite, and other times likely due to an unwillingness to bother incorporating or integrating it into their product stack. That alone robs everyone of something that "could have been" but now never will be... After all, if it wasn't something in some way novel, then A) the startup wouldn't have existed, B) they wouldn't have gotten far enough to have garnered public or private interest, and most importantly C) you [said purchasing company] wouldn't have felt threatened enough to purchase them to begin with! So why squirrel it away afterwards... 😞
Even in the cases where the acquisition IS made to utilize whatever-it-is, sometimes it can be for the better, such as giving the team not just a larger budget but also access to a wide range of other information/patented goodies; other times it can end up harming the team, on account of their need to adhere to a new management structure. Therefore, acquiring other companies, frivolously OR sincerely, could actually hinder progress and prevent the team from achieving the greatness they could have, had they remained independent...
Anyways, my point here is that everything sadly revolves around money, and we have become a greedy species, too willing to put principles or morals aside in order to make a buck... I feel this often-cutthroat society we live in has caused, and will cause, a great many to take up the mindset of: _"Why should I even bother?"_ In turn, some people who could lend a genuine hand in improving something, or offer ideas to make things better, won't bother to. Others who want to won't be able to, due to there being no open-mindedness from governments, companies, whatever, to let people submit their ideas (and then, if it bears fruit in any way, see some return for it if desired). In the immortal words of the Looney Tunes "French skunk" character Pepé Le Pew: _"*Le SIGH*"_ 😞
(In fairness where it's due, I think the Ukrainian gov't has indeed been leveraging their people as a "think tank", and that's likely to thank for why they've done so well with their drone initiatives and designs. Hopefully that can continue, even after they've won!) 🍻✌️
    1