Comments by "Scott Franco" (@scottfranco1962) on "Asianometry"
channel.
-
@bunyu6237 A couple of reasons (why software rather than hardware). First of all, there is a larger group of people working on software than on hardware, so the jobs are more plentiful and the demand greater. Second, hardware/software crossover people are considered odd birds; when I used to do both, I had people literally telling me to "pick a side", go one way or the other. I find it easier to get and do software projects, and the pay is better. I dabbled in Verilog long after I stopped being paid for hardware design, and I realized it would take a lot of work to get a foothold in serious Verilog design with virtually no corresponding increase in salary, and more likely a decrease while I gained credibility as a Verilog designer. The last time I was paid to design hardware it was still schematic entry (and yes, in case you haven't figured that out, I am indeed that old).
Of course, a lot of this is my personal situation. I am not sure any of the above would serve as career advice. I definitely consider my hardware background a career asset, since I specialize in low-level software design (drivers, embedded, etc.). Having said that, I keep up with hardware advances and have often dreamed of uniting my Verilog experience with my software experience. That dream is unrealized.
-
Good job. There was a bit of conflation there with microcode (is it firmware?). It would have helped to underline that it is entirely internal to the chip and operates the internals of the CPU. In any case, microcode was discarded with the Pentium series, KINDA. It actually lives on today in so-called "slow path" instructions like block moves in later CPUs, which use microcode because nobody cares whether they run fast or not, since they are generally only used for backwards compatibility and were deprecated in 64-bit mode.
I await the second half of this! Things took off again with AMD64 and the "multicore wars". Despite the mess, the entire outcome probably could have been predicted on sheer economic grounds, that is, the market settling into a #1 and #2 player with small also-rans. Today's desktop market, at least, remains in the hands of the x86 makers, except for the odd story of Apple and the M-series chips. Many have pronounced the end of the desktop, but it lives on. Many or even most of my colleagues use Apple Macs as their preferred development machines, but, as I write this, I am looking out at a sea of x86 desktop machines. It's rare to see a Mac desktop, and in any case, at this moment even the ubiquitous MacBook Pro laptops the trendsetters love are still x86 based, although I assume that will change soon.
Me? Sorry, x86 everything, desktop and laptop(s). At last count I have 5 machines running around my house and office, and 4 laptops. I keep buying Mac laptops and desktops, 'cause, you know, gotta keep up with things, but they grow obsolete faster than a warm banana. Yes, I had PowerPC Macs, and yes, they ended up in the trash. And yes, I will probably buy a Mac M2 at some point.
-
You have tapped into a classic boondoggle here. When I was with Cisco around 2000, the big "new wave" was all-optical switching, which was supposed to be faster than converting optical to electronic and back again, often using DMDs (Digital Micromirror Devices). How'd that work out? The startup world was littered with the smoking holes of failed companies.
I think the bottom line is that we know a lot about devices that operate on electrical signals, but not so much about devices that work on pure light. Everyone knows what an electrical NAND gate is, and optical NAND gates are possible, but good luck getting one to work, be integrated at high densities, and be efficient. Let's start with the basics. You can route signals easily in electronics, and the 10+ layers of interconnect on current ICs attest to this. What's a light conductor on an IC? Well, air, which should be free, but is far from it: you would have to couple light in and out of the IC at many points, which is expensive in terms of real estate. You could conduct with glass, and that is a whole 'nuther level.
I'm not saying never. I'm just saying that with any breathless new wave of technology, you have to look at history and see whether that wave has broken before, like every 5 years or so (cough... AI).
-
@ttb1513 The last company where I worked on silicon design, Seagate, was emblematic of the basic problems with ASICs. It was 1995 and they were new at in-house ASIC design. And they weren't very good at it. It was random logic design, before Verilog. An example was our timing verification. We had timing chains that were too long for the cycle time, and thus they simply allowed for the fact that the signal would be sampled in the next clock cycle. Now, if you were an ASIC designer back then, what I just said would have made you reach for the Tums, if not a 911 call for cardiac arrest. It's an open invitation to metastability. And indeed, our AT&T fab guys were screaming at us to stop that. I got put in charge of hardware simulation for the design, and I have detailed this fiasco in these threads before, so I won't go over it again.
The bottom line was that ASIC process vendors were losing trust in their customers to perform verification. The answer was that they included test chains in the designs that would automatically verify the designs at the silicon level. It meant that the manufactured silicon would be verified, that is, the chip would be checked for fabrication defects regardless of what the design did. My boss, who with the freedom of time I can now certify was an idiot, was ecstatic over this new service. It was a gonna fixa all o' de problems, don't ya know? I pointed out to him, pointlessly I might add, that our design could be total cow shit and still pass these tests with flying colors. It was like talking to a wall.
In any case, the entire industry went that way. Designs are easier to verify now that the vast majority of designs are in Verilog. I moved on to software only, but I can happily say that there are some stunning software verification suites out there, and I am currently working on one, so here we are.
-
@AlexanderSylchuk Oh, you are begging for my favorite story. I interviewed with a company that made precision flow valves. These were mechanical nightmares of high precision that accurately measured things like gas flow in chemical processes. That covers about half the chemical industry (did you know a lot of chemical processes use natural gas as their feedstock?). Anyway, what has that got to do with this poor programmer? Well, like most industries, they were computerizing. They had a new product that used a "bang bang" valve run by a microprocessor. A bang-bang valve is a short piston driven by a solenoid: when not energized, the piston is retracted by a spring, opening an intake port that lets a small amount of gas into a chamber. Then the solenoid energizes, pushes the piston up, and pushes the gas out another port. Each time the solenoid activates, a small amount of gas is moved along. Hence the "bang bang" part. If you want to find one in your house, look at your refrigerator. It's how the Freon compressor in it works.
Ok, well, that amount of gas is not very accurately measured, no matter how carefully you machine the mechanism. But it turns out to be "self accurate", that is, whatever the amount of gas IS that is moved, it is always the same. The company, which had gotten quite rich selling its precision valves, figured it could produce a much cheaper unit using the bang-bang valve. So they ginned it up, put a compensation table in it so the microprocessor could convert gas flows to bang-bang counts, and voila! Here is the product! It worked. Time to present it to the CEO! The CEO asks the engineers "just how accurate is it?" Engineer says:
Well... actually it is more accurate than our precision valves. And far cheaper.
The story as told me didn't include just how many drinks the CEO needed that night.
So the CEO, realizing that he had seen the future, immediately set into motion a plan to obsolete their old, expensive units and make the newer, more accurate and cheaper computerized gas flow valves.
Ha ha, just kidding. He told the engineers to program the damn thing to be less accurate so that it wouldn't touch their existing business.
Now, they didn't hire me. Actually, long story: they gave me a personality test that started with something like "did you love your mother?" I told them exactly where, in what direction, and with how much force they could put their test, and walked out.
I didn't follow up on what happened, mainly because I find gas flow mechanics to be slightly less interesting than processing tax returns. But I think if I went back there, I would find a smoking hole where the company used to be.
And that is the (very much too long) answer to your well-meaning response.