搞钱树
11-27
$Google(GOOG)$
In certain scenarios, such as ultra-large-scale model training (on the order of tens of thousands of chips) and inference, Google's TPUs reportedly beat Nvidia GPUs of comparable compute on energy efficiency, with the improvement said to be at least 30%, and some estimates put it even higher.
Nvidia vs. Google! Which way will market sentiment lean in December?
Oracle and other companies have borrowed another US$38 billion, bringing the cumulative debt of the "OpenAI chain" data-center circle to US$100 billion. In November, Nvidia fell a cumulative 12% and Oracle fell 23%, while the "new AI king" Google rose more than 14%. [Compute has not yet become a "zero-sum" game, but the market remains sharply divided over "GPU vs. TPU." Do you think sentiment will swing back to the "Nvidia chain" in December? And what is your trading strategy on the two sides right now?]