Comments by "miraculixx" (@miraculixxs) on the "Bloomberg Originals" channel.
"In a gold rush, sell shovels" - that's what this site is. Shovels, not AI.
303
Not unprecedented at all. It's a moat-building tactic. Every high-tech company ever is doing this. It's called lobbying.
24
"I have not built anything before"
9
Is this supposed to be journalism? Seems more like a PR stunt
6
Sammi Issa I read up on it. The filing actually said 2bn over 5 years to Google, so that's 400m/year. Weird, considering their revenue is at 400m/yr. Essentially all revenue goes straight to Google.
4
$1bn to Amazon?! WTF?! With that kind of money they could build their own top-tier data center. I guess this was misquoted.
3
Why the shady lights? On second thoughts perhaps quite fitting 😮
3
So far they are just building the buildings. The GPUs may come, or not. Who knows.
3
No doubt great business for Crusoe and the builders. This, however, is not at all a guarantee of successful use of all of those GPUs. It is highly questionable whether this money will ever be made back. So far nobody is making money from AI infrastructure, except Nvidia and perhaps Azure and AWS, because they serve B2B. But it's great for consumers and B2B: the supply overhang will drive prices to near zero.
1
On what scale?
1
Thank engineers that cell phones have sophisticated noise filters built in, and TV mics probably do too. Are Bloomberg viewers assumed to be that low-tech? Wow.
1
@pmmm712011 Travelling is actually useful
1
@12:00 Actually, it is CPUs that work on lots of small tasks at once. GPUs perform only one large task (calculate stuff) at a time, but on a lot of data.
1
@MAGAjaga-d4p That is not accurate (although a popular misconception). CPUs are multi-core, pipelined, and multi-threaded at the hardware level to enable task parallelism. A single modern CPU can handle and execute hundreds of tasks simultaneously, each task handling a very small piece of data at a time. Think many cars on many streets, with very little coordination. A modern GPU has hundreds to thousands of cores that are all synchronized on a single task executed over a large array of numbers: there is data parallelism, but no task parallelism. Think a few lorries lined up on a single road. Or better yet, think of a freight plane: huge load capacity, one destination at a time, lots of coordination. (A rough code sketch of the contrast follows below.)
1
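The contrast drawn in the comment above, many independent little tasks on a multi-core CPU versus one operation applied in lockstep to a large array on a GPU, can be sketched in a few lines of Python. This is an illustrative sketch only: the names small_independent_task, cpu_style_task_parallelism, and gpu_style_data_parallelism are made up for the example, and NumPy vectorization (which still runs on the CPU) merely mimics the shape of GPU-style data-parallel work.

```python
# Illustrative sketch only: contrasts task parallelism (what CPUs are built for)
# with data parallelism (what GPUs are built for). All names are hypothetical.
from concurrent.futures import ThreadPoolExecutor

import numpy as np


def small_independent_task(x: int) -> int:
    # One of many unrelated little jobs: the "many cars on many streets" case.
    return x * x + 1


def cpu_style_task_parallelism(inputs: list[int]) -> list[int]:
    # A multi-core CPU runs many distinct tasks concurrently,
    # each touching only a small piece of data.
    with ThreadPoolExecutor(max_workers=8) as pool:
        return list(pool.map(small_independent_task, inputs))


def gpu_style_data_parallelism(data: np.ndarray) -> np.ndarray:
    # A GPU applies the *same* operation to a large array in lockstep:
    # the "freight plane" case, one destination, huge payload.
    # (NumPy on the CPU only mimics that shape of work here.)
    return data * data + 1


if __name__ == "__main__":
    print(cpu_style_task_parallelism(list(range(10))))  # many small tasks
    print(gpu_style_data_parallelism(np.arange(10)))    # one op, big array
```

In real GPU code the same idea would appear as a single kernel launched across thousands of synchronized threads, each handling one element of the array.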
@Brain4Brain Humans are responsible for their answers.
1