
• Jonathan Challinger claims his Cybertruck drove headlong into a streetlight while in Full Self-Driving mode. Elon Musk wants the technology to be ready for a June robotaxi launch.
The image of a wrecked Tesla Cybertruck lying motionless on the side of the road, wrapped around a pole with its right wheel dangling, is shocking. Driver Jonathan Challinger posted the undated picture on Sunday, claiming Tesla’s automated Full Self-Driving (FSD) software caused his vehicle to crash into a light post while he wasn’t looking.
While Challinger escaped without harm, he warned others might not be so lucky. “Spread my message and help save others from the same fate or far worse,” he wrote.
The post received 2 million views, sparking fierce debate as to whether FSD is good enough to be used without humans behind the wheel.
It comes less than five months before Tesla CEO Elon Musk’s crucial launch of an autonomous robotaxi service, a core pillar supporting Tesla’s more than $1.1 trillion market cap.
According to Challinger’s account, the car failed to leave a lane that was ending, even though no vehicles blocked it from merging into the adjacent lane, and it made no attempt to slow down or turn until it was too late.
Google Maps and Street View imagery show that the road layout matches the photo in Challinger’s post. An official for the Reno Police Department confirmed to Fortune that there was a crash involving a driver named Challinger on Feb. 6, but declined to give further details pending a full report being filed.
Challinger tagged Musk, AI director Ashok Elluswamy, Tesla’s entire AI team, and Cybertruck lead engineer Wes Morrill in the tweet.
The carmaker constantly collects data from FSD to train its driving software. In the past, it has immediately denied crash accounts that were not true.
At the time of publication, Tesla had not responded to Fortune’s request for comment. Fortune also contacted Challinger for comment but did not receive a reply.
‘Big fail on my part. Don’t make the same mistake I did’
Tesla only rolled out FSD to the Cybertruck in September, a full 10 months after the vehicle launched.
The pickup is larger, sits higher on the road, and is more complex to engineer than a Tesla sedan, since it steers with all four wheels.
One of the best-known and most impartial Tesla FSD testers attested to the plausibility of Challinger’s account of the crash.
“The situation you describe is very common, where the planner or decision-making to get in the appropriate lane early enough often gets the vehicle in a bind, where it runs out of options,” replied Chuck Cook, who was tagged in the post by Challinger. “You are NOT the only one.”
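The failure mode Cook describes comes down to timing: a planner that defers a lane change can reach a point where the maneuver is no longer physically possible. The snippet below is a minimal illustrative sketch, not Tesla’s software; the function name, the three-second merge time, and the safety margin are assumptions chosen purely for illustration.

# Toy illustration only (not Tesla's code): why deferring a lane change
# can leave a planner with no options once its lane is about to end.
def must_merge_now(distance_to_lane_end_m: float,
                   speed_mps: float,
                   merge_time_s: float = 3.0,
                   margin_m: float = 20.0) -> bool:
    """True once the vehicle can no longer safely postpone the merge."""
    # Road consumed while executing the merge, plus a safety buffer.
    required_m = speed_mps * merge_time_s + margin_m
    return distance_to_lane_end_m <= required_m

# At 25 m/s (about 56 mph) the decision point arrives with roughly 95 m
# of the ending lane left; a planner that waits longer, or finds the
# adjacent lane blocked at that moment, has "run out of options."
for remaining in (300, 150, 80, 40):
    print(remaining, must_merge_now(remaining, speed_mps=25.0))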
Challinger was quick to admit negligence and accept ultimate responsibility for failing to supervise the system, as Tesla requires of all its owners who use FSD.
“Big fail on my part, obviously. Don’t make the same mistake I did. Pay attention. It can happen,” he warned, requesting a means to deliver dashcam footage to Tesla’s AI team for analysis.
Challinger, moreover, dismissed accusations that he was acting in bad faith by trying to capitalize on the scrutiny surrounding Musk, Tesla, and FSD ahead of the commercial launch in June.
“I just want to get the data to Tesla if I can. I tried everything I could think of to get in touch with them,” he said.
Doubt over date of reported crash
Challinger had previously acknowledged early last month—in a separate post he didn’t flag widely—that he had been involved in a serious accident.
Responding to a question about the Cybertruck’s structural ability to absorb energy in a frontal collision, he wrote in early January: “Having crashed mine, can confirm that it crumples just fine.”
He said he had repeatedly tried to get the dashcam footage to Tesla.
Challinger specified that the crash occurred while using FSD v13.2.4, a software version that had only been widely rolled out to all FSD users roughly a week after his earlier post.
Musk’s track record raises doubts about the technology
Just months ahead of a planned June launch, CEO Musk has yet to publish any independently verifiable data to back up his claim that FSD is ready to be used in an unsupervised, fully autonomous robotaxi.
By comparison, rivals such as Waymo report their disengagements to state regulators. Tesla, however, has used a legal loophole to avoid this transparency for years.
Musk has also repeatedly misstated facts.
Tesla’s AI director, Elluswamy, testified in court that Musk ordered him to doctor a marketing video to mislead consumers about Tesla’s FSD capabilities.
More recently, Musk admitted Teslas running on older AI3 inference computers have, in fact, failed to live up to his claim that all cars built after 2016 are capable of autonomous driving.
He plans to replace that hardware with the newest generation in those vehicles where customers purchased FSD. How exactly that can be done and at what cost is unclear.