浪潮上的小舢板
10-07
A rebound from the brink! Up!!!
Amazon's FAR Team Releases OmniRetarget for Humanoid Robots
Disclaimer: The content above represents only the poster's personal views and does not constitute investment advice from this platform.
2025-10-06 16:28 Beijing time · DoNews

Amazon's robotics team FAR (Frontier AI for Robotics) has released its first humanoid-robot research result, OmniRetarget, which achieves "blind" locomotion without any perception units and completes complex maneuvers such as carrying a chair, climbing onto a table, and somersaulting within 30 seconds.

OmniRetarget is an open-source data-generation engine. It models the spatial and contact relationships among the robot, objects, and terrain through an interaction mesh, converting human demonstrations into high-quality kinematic references suited to humanoid robots. It supports loco-manipulation skills under whole-body control and achieves zero-shot transfer from simulation to hardware.

The interaction mesh is defined as a volumetric structure whose vertices include key joints plus sample points on objects and the environment, constructed via Delaunay tetrahedralization. The algorithm minimizes the Laplacian deformation energy between the source and target motions, preserving relative spatial structure and contact relationships. Each frame's robot configuration is obtained by solving a constrained non-convex optimization problem subject to hard constraints such as collision avoidance, joint limits, and slip prevention, with a sequential-quadratic-programming-style method enforcing temporal continuity and smoothness.

The system expands a single human demonstration into diverse trajectories by parametrically varying object configuration, object shape, or terrain features. For object interaction, it augments object poses and translations while pinning the lower body to the nominal trajectory, letting the upper body explore new coordination patterns; for terrain interaction, it adjusts platform height and depth and introduces additional constraints to generate multiple scenarios. Reinforcement learning then bridges the dynamics gap, training a low-level policy that turns kinematic trajectories into physically executable actions.

The robot relies only on proprioception and the reference trajectory as prior knowledge; policy inputs include reference joint positions/velocities, pelvis error, pelvis linear/angular velocity, joint state, and previous actions. The reward function comprises five term groups (body tracking, object tracking, action rate, soft joint limits, and self-collision), combined with domain randomization to improve generalization. Similar motions are grouped for training to speed convergence, and separate policies are used for different tasks.

Experiments show a Unitree G1 running OmniRetarget completing a 30-second multi-stage parkour task. Training on the fully augmented dataset reached a 79.1% success rate, close to the 82.2% achieved with nominal motions alone, indicating that augmentation substantially broadens motion coverage without markedly degrading performance. Against baselines such as PHC, GMR, and VideoMimic, OmniRetarget is better overall on penetration, foot-slip, and contact-preservation metrics, and slight penetration can be repaired by reinforcement learning.

Downstream policy evaluation shows OmniRetarget leading the baselines by more than 10 percentage points in success rate across all tasks, with more stable performance. Experiments were conducted on the OMOMO, in-house MoCap, and LAFAN1 datasets. Amazon FAR, formed just over seven months ago from the former Covariant team, counts Pieter Abbeel, Peter Chen, Rocky Duan, and Tianhao Zhang among its core members, with Rocky Duan serving as research lead. This is the team's first public effort in the humanoid-robot field.
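The Laplacian deformation energy mentioned above is a standard ingredient of interaction-mesh retargeting: each mesh vertex is encoded by its offset from the centroid of its neighbors, and the retargeted motion is chosen so these local offsets stay close to the source's. A minimal sketch of that energy term, not Amazon's actual implementation (function names and the neighbor representation are illustrative, and the real system minimizes it jointly with collision, joint-limit, and slip constraints):

```python
import numpy as np

def laplacian_coords(vertices, neighbors):
    """Laplacian (umbrella) coordinate of each vertex: its offset from the
    centroid of its mesh neighbors. This encodes the local spatial/contact
    structure that retargeting tries to preserve."""
    lap = np.empty_like(vertices)
    for i, nbrs in enumerate(neighbors):
        lap[i] = vertices[i] - vertices[nbrs].mean(axis=0)
    return lap

def deformation_energy(src, tgt, neighbors):
    """Sum of squared differences between the Laplacian coordinates of the
    source (human demonstration) and target (robot) interaction meshes.
    Driving this toward zero keeps relative structure and contacts intact."""
    diff = laplacian_coords(src, neighbors) - laplacian_coords(tgt, neighbors)
    return float((diff ** 2).sum())
```

One useful property: because Laplacian coordinates are differences of positions, the energy is invariant to rigid translations of the whole mesh, so it penalizes only changes in *relative* arrangement, which is exactly what matters for preserving contacts while adapting to a differently proportioned robot body.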