
Thread: LE Thoughts on Autonomous Cars

  1. #1

    LE Thoughts on Autonomous Cars

    I'm posing the question here, as you guys are the first ones to see the aftermath of poor driving decisions.

    Based on your experience in the field with accidents and the investigations thereof, would letting a computer drive instead of a human be a net benefit or a liability on the nation's roads?

    Given what I've seen on my commute, I'd wager letting an abacus drive is better than most people, but that's a conclusion outside of my lane, so to speak.
    The Minority Marksman.
    "When you meet a swordsman, draw your sword: Do not recite poetry to one who is not a poet."
    -a Ch'an Buddhist axiom.

  2. #2
    Member
    Join Date
    May 2016
    Location
    Dallas
    I think it's kind of like airplanes. Most of the time autopilot is fine; it's those rare moments when you need someone at the controls who is paying attention.

    In a lot of ways I think automated cars that take human behavior out of the equation and normalize traffic, that is, every car travels at the same speed, gets up to speed and merges correctly, maintains an actual safe following distance, etc., would substantially increase the safety of the roads and increase the average speed of traffic. The problem is when we mix computer controls that just aren't up to the task of dealing with asshole drivers on the road, and the asshole drivers themselves.

  3. #3
    Member Peally's Avatar
    Join Date
    Mar 2014
    Location
    Wisconsin, USA
    I trust computers about as much as I trust drunks.

    Non-LEO opinion though.
    Last edited by Peally; 07-01-2016 at 10:09 AM.
    Semper Gumby, Always Flexible

  4. #4
    Site Supporter
    Join Date
    Aug 2011
    Location
    TEXAS !
    You mean like this?

    Tesla driver using Autopilot feature killed by tractor trailer

    http://www.foxnews.com/leisure/2016/...actor-trailer/

  5. #5
    STAFF Hambo's Avatar
    Join Date
    Aug 2014
    Location
    Behind the Photonic Curtain
    Quote Originally Posted by txdpd View Post
    In a lot of ways I think automated cars that take human behavior out of the equation and normalize traffic, that is, every car travels at the same speed, gets up to speed and merges correctly, maintains an actual safe following distance, etc., would substantially increase the safety of the roads and increase the average speed of traffic. The problem is when we mix computer controls that just aren't up to the task of dealing with asshole drivers on the road, and the asshole drivers themselves.
    Yeah, this ^^^^. If everybody got autopilot tomorrow it would change some things. Adding autopilot slowly over years with buttheads still making their own decisions...not much change. The most common wreck I see here is: distracted/stoned butthead plows into the rear of a car stopped at a red light. There's nothing autopilot can do about that unless they add rear-facing Hellfire launchers.
    "Gunfighting is a thinking man's game. So we might want to bring thinking back into it."-MDFA

    Beware of my temper, and the dog that I've found...

  6. #6
    Site Supporter Tamara's Avatar
    Join Date
    Feb 2011
    Location
    In free-range, non-GMO, organic, fair trade Broad Ripple, IN
    Quote Originally Posted by HCM View Post
    You mean like this?

    Tesla driver using Autopilot feature killed by tractor trailer

    http://www.foxnews.com/leisure/2016/...actor-trailer/
    In the fine print, of course, is the fact that this is the first autopilot fatality in 130 million vehicle-miles. The US average is one every 94 million vehicle-miles. But that's not very sensational, so... Quick! We need a government investigation and maybe some new laws. And perhaps a new bureaucracy with job openings we can bestow on the relatives of campaign contributors.
    Last edited by Tamara; 07-01-2016 at 01:56 PM.
    Books. Bikes. Boomsticks.

    I can explain it to you. I can’t understand it for you.

  7. #7
    Site Supporter
    Join Date
    Aug 2011
    Location
    TEXAS !
    Quote Originally Posted by Hambo View Post
    Yeah, this ^^^^. If everybody got autopilot tomorrow it would change some things. Adding autopilot slowly over years with buttheads still making their own decisions...not much change. The most common wreck I see here is: distracted/stoned butthead plows into the rear of a car stopped at a red light. There's nothing autopilot can do about that unless they add rear-facing Hellfire launchers.
    Not a bad idea.

    Seriously, I agree. Autonomous cars could work if ALL cars on the road were autonomous. Mixing is where you will see problems.

  8. #8
    Modding this sack of shit BehindBlueI's's Avatar
    Join Date
    Mar 2015
    Location
    Midwest
    I've not worked traffic in a while, but with the assumption things haven't changed much:

    Serious accidents are usually caused by one of a handful of causes. Failure to yield on a left turn is a big one, as well as following too closely. Add in failure to maintain a lane of travel and disregarding a traffic signal/sign and you've got most of them. Computers don't drive with ego, don't drink, don't text, etc. I suspect that would eliminate most of those crashes.

    I think self driving cars are inevitable, and will follow a progression of more and more computer and less and less human. Maybe humans drive on city streets, computer takes over on the interstate (which is less complicated), and then computers are good at city streets and we only drive in parking lots, then 100% computer...or something along those lines. I think, as a whole, they will also be safer.

    What will be interesting for LEO is, if all the cars obey all the traffic laws, how many other crimes will go undetected. Traffic stops are like boxes of chocolates: you never know what you're going to get (you just read that in Forrest Gump's voice), and a lot of dope, illegal guns, fugitives, etc. are detected on traffic stops. It'll be a lot easier to run guns and dope if there are no interdiction stops.

  9. #9
    Quote Originally Posted by Tamara View Post
    In the fine print, of course, is the fact that this is the first autopilot fatality in 130 million vehicle-miles. The US average is one every 94 million vehicle-miles. But that's not very sensational, so... Quick! We need a government investigation and maybe some new laws. And perhaps a new bureaucracy with job openings we can bestow on the relatives of campaign contributors.
    An important fact that shouldn't go overlooked, but cannot be taken simply at face value.

    Dying because you fucked up is very different than dying because the computer fucked up (or, more likely, the people who designed and built it).

    On a personal level, as far as reasoning goes, an idealized logical driver would have to figure out whether their own driving skill makes their death probability higher or lower than that of the autopilot. Certainly the US average isn't applicable to everyone. It probably wouldn't even be the average for everyone on this forum; far from it, I'd guess. Even in the presence of this sort of reasoning, the question of whose hands the fate of the driver is in would likely still be important to many people.
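
    To make that comparison concrete, here's a rough back-of-envelope sketch, nothing more. The 130 million and 94 million figures are the ones quoted above; the personal rate and the annual mileage are purely hypothetical placeholders, not real data:

        # Back-of-envelope comparison of per-mile fatality rates.
        # The autopilot and US-average figures are the ones quoted above;
        # "my_rate" and "annual_miles" are hypothetical assumptions for illustration only.
        autopilot_rate = 1 / 130_000_000   # fatalities per vehicle-mile (quoted figure)
        us_average_rate = 1 / 94_000_000   # fatalities per vehicle-mile (quoted figure)
        my_rate = 1 / 200_000_000          # hypothetical personal rate (driver better than average)
        annual_miles = 12_000              # hypothetical annual mileage

        for label, rate in [("autopilot", autopilot_rate),
                            ("US average", us_average_rate),
                            ("hypothetical me", my_rate)]:
            print(f"{label}: ~{rate * annual_miles:.1e} expected fatalities per year of driving")

    The point isn't the particular numbers; it's that the decision depends on how your own per-mile rate compares to the autopilot's, and nobody actually knows their own number.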

    On a societal level, we need to decide if we're willing to be, on the whole, "safer" by living or dying based on the collective skills, triumphs, and failures of The People (in this case: the coders, the engineers, etc.) as opposed to our fate being largely in our own hands. It's a visceral manifestation of the classic individualism vs. collectivism issue. Certainly we face this issue even today when using any sort of risk-inherent technology, and in a large variety of other places, but it is on an entirely different level when dealing with near-AI levels of technology that seek to directly replace the sort of high-level decision-making tasks our brains have evolved to execute.

    The fact that these issues seem largely to have been decided for us already (I'm guessing most people think some sort of mandatory autopilot legislation is quite likely years from now, when the technology is ripe) is a truly frightening view into modern philosophy and the death of individual responsibility. "The people must be saved from themselves."

    It has fundamental relationships to the 2A argument. Would you rather live in a world where our government did its best to avoid allowing anyone to be armed, or would you rather have the right to be armed and assume the additional risk associated with all the idiots around you having that right too?
    Last edited by GRV; 07-01-2016 at 04:07 PM.

  10. #10
    Modding this sack of shit BehindBlueI's's Avatar
    Join Date
    Mar 2015
    Location
    Midwest
    Quote Originally Posted by dove View Post
    Dying because you fucked up is very different than dying because the computer fucked up (or, more likely, the people who designed and built it).
    Where does dying because the other driver fucked up fit into this spectrum?
