On Current Events, Part 2

The blog passed 10 GB, so I moved over to Part 2. I've got things to say, woof!

At the moment of the collision, the car was traveling above the speed limit
and the Autopilot feature was engaged.
Reportedly, neither the automated system nor the driver took any evasive action at the moment of impact.
>As I wrote before, drivers are under tremendous pressure from Tesla,
so my guess is they can't risk a careless evasive maneuver that would break the streak of continuous automated driving.

Second driver fatality: "semi-autonomous feature in use" during fatal Tesla crash in the US


Afterward, the highly paid programmers and executives will grill the driver, brooking no objection: why did you take the wheel on your own?
Of course the driver steered and braked because the situation looked dangerous,
but whether the logs later confirm it really was dangerous is for the programmers to decide.
Touch the controls carelessly and you'll be fired, on the spot!

And when an accident really does happen, the responsibility falls on the driver.
Autopilot has no legal accountability, after all. That's exactly why a driver is put on board.
"Tesla drivers have logged more than one billion miles with Autopilot engaged, and our data shows that, when used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance," said the firm in a statement.
So in the end, Tesla is blaming the driver?
"Safe" versus "even safer"?
It's like saying, "Smoking is good for your health, and not smoking is even better for your health."
They claim that when used properly by an attentive driver who is always ready to take over, Autopilot makes driving safer than driving unassisted. In other words, they seem to be implying the crash happened because the driver was no good.
>Then what's the point of Autopilot?
   Sounds like a liability-dodging, litigation-proofing statement from Tesla...

The crashed car in the photo is red... it looks different from the crashed car in the earlier article...

Tesla Model 3: Autopilot engaged during fatal crash

  • 17 May 2019
[Image: The Tesla Model 3 after the crash]
A preliminary report into a fatal accident involving a Tesla Model 3 in the US has found Autopilot had been engaged 10 seconds before the crash.
The Tesla was travelling above the speed limit when it crashed into a truck towing a trailer in March 2019.
The roof of the car was sheared off in the accident and the driver was killed.
According to the report, the driver did not appear to have his hands on the wheel and neither he nor the Autopilot took any evasive action.
The investigation into the accident, which happened on a highway in Delray Beach, Florida, is being carried out by the National Transportation Safety Board (NTSB).
Tesla said it was the only time during the journey that Autopilot had been activated.
"Tesla drivers have logged more than one billion miles with Autopilot engaged, and our data shows that, when used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance," said the firm in a statement.
Tesla does not recommend that drivers remove their hands from the wheel when using the vehicle's Autopilot feature.
[Video: Tesla CEO Elon Musk explained how the Autopilot system worked during its unveiling in October 2014]
On 23 March 2018, a Tesla Model X crashed into a roadside barrier and caught fire while on Autopilot, which also resulted in the death of the driver.
On that occasion Tesla did not reveal whether Autopilot had spotted the barrier.
In May 2016 another driver of a Tesla car died when his car failed to spot a lorry crossing its path.
The driver was found to have used Autopilot for 37 minutes but only had his hands on the wheel for 25 seconds.

Tesla in fatal California crash was on Autopilot

  • 31 March 2018
Electric carmaker Tesla says a vehicle involved in a fatal crash in California was in Autopilot mode, raising further questions about the safety of self-driving technology.

One of the company's Model X cars crashed into a roadside barrier and caught fire on 23 March.
Tesla says Autopilot was engaged at the time of the accident involving the driver, 38, who died soon afterwards.
But they did not say whether the system had detected the concrete barrier.
"The driver had received several visual and one audible hands-on warning earlier in the drive," a statement on the company's website said.
"The driver's hands were not detected on the wheel for six seconds prior to the collision."
"The driver had about five seconds and 150m (490ft) of unobstructed view of the concrete divider... but the vehicle logs show that no action was taken," the statement added.
Tesla's Autopilot system does some of the things a fully autonomous machine can do. It can brake, accelerate and steer by itself under certain conditions, but it is classified as a driver assistance system, is not intended to operate independently and as such the driver is meant to have their hands on the wheel at all times.
In 2016, a Tesla driver was killed in Florida when his car failed to spot a lorry crossing its path.
It led the company to introduce new safety measures, including turning off Autopilot and bringing the car to a halt if the driver lets go of the wheel for too long.
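The safety measure described above — disengaging Autopilot and bringing the car to a halt when the driver's hands are off the wheel too long — can be pictured as a simple per-tick state machine. This is a minimal illustrative sketch, not Tesla's actual implementation; the function name, the 6-second threshold (borrowed from the statement quoted earlier), and the deceleration rate are all assumptions.

```python
# Hypothetical sketch of a hands-off-wheel timeout. All names and
# thresholds here are illustrative assumptions, not Tesla's real logic.

HANDS_OFF_LIMIT_S = 6.0   # seconds of hands-off driving tolerated
STOP_DECEL_MPS2 = 2.0     # gentle deceleration once the system gives up

def autopilot_step(hands_on: bool, hands_off_time: float,
                   speed: float, dt: float):
    """Advance one control tick.

    Returns (engaged, hands_off_time, speed) for the next tick.
    """
    if hands_on:
        # Driver is holding the wheel: reset the timer, stay engaged.
        return True, 0.0, speed
    hands_off_time += dt
    if hands_off_time < HANDS_OFF_LIMIT_S:
        # Warning phase: still engaged, timer running.
        return True, hands_off_time, speed
    # Limit exceeded: disengage and bring the car to a stop.
    speed = max(0.0, speed - STOP_DECEL_MPS2 * dt)
    return False, hands_off_time, speed
```

For example, simulating 10 seconds of hands-off driving at 30 m/s with a 0.1 s tick: the system stays engaged for the first 6 seconds, then disengages and begins slowing the car.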
[Video: Uber dashcam footage shows the moment before the fatal impact]
The accident in California comes at a difficult time for self-driving technology.
Earlier this month, Uber was forbidden from resuming self-driving tests in the US state of Arizona.
It followed a fatal crash in the state in which an autonomous vehicle hit a woman who was walking her bike across the road.
It was thought to be the first time an autonomous car had been involved in a fatal collision with a pedestrian.
The company suspended all self-driving tests in North America after the accident.

Copyright (C) 2019 Yahoo Japan Corporation. All Rights Reserved.

みんなの更新記事