YouTube comments of (@bycloudAI).

  75. (27 likes) Check out NVIDIA NIM now using this link https://nvda.ws/3Jn5pxb

      It seems the general consensus is that people are disappointed with Stable Diffusion 3 because so much potential was wasted. I definitely think it's a bit unfortunate too...

      EDIT: Update on the new SD3 terms. The license was changed and is now free for research and also for commercial use up to $1M. - Lykon

      Some elaborations by Lykon (key researcher for SD3):
      1. 2B was an experiment at scaling down 8B, but it's essentially the same architecture. As Alex said on Discord, it was supposed to be released as "beta", but the label got removed last minute. That was a mistake.
      2. 4B is very experimental and, as of today, has roughly the same issues as 2B. It's also using a different architecture and only one text encoder (which is probably the worst of the 3 and the heaviest).
      3. 2B was released before 8B because it's easier/cheaper for us to finetune, and it's also the perfect size for the community (having roughly the same resource cost as SDXL).
      4. The new CEO just changed the license entirely, making it free for research purposes, free for commercial use up to $1M USD, and free for non-commercial use.
      5. We are working on a new Medium model to address the issues.
      6. The issues with 2B were not intentional. The architecture was made and tested at 8B params and, it turns out, MMDiT scales very well at high param counts but has attention issues at low param counts. We are addressing this.
      7. The 512mar SD3M that Alex showed is not the finetuned one. I just re-tested the one I have today on "that prompt" and it works much better. I'm pushing to release it (even if it's not super useful).