bob p 1/19/2018 1:03 PM
Self-Driving Cars Can't See Black
I just caught something on Bloomberg TV -- they were saying that the people who are designing self-driving cars are having problems with the sensors on those cars failing to accurately identify cars that are painted black.

This reminds me of a recent thread in which someone commented that self-driving cars had trouble differentiating trees from roadways, and they ended up driving off of the roads into trees. The common element seems to be trouble with optical sensors accurately recognizing dark objects.

The proposed solution is to require changes to the dark paint used on non-self-driving cars, so that self-driving cars are able to accurately recognize them. A researcher at PPG decided that the answer was an approach borrowed from the eggplant -- a translucent color top coat over a reflective metallic base coat, to make the non-self-driving car more visible to the self-driving car.

What a strange idea. Instead of requiring the people who are making self-driving cars to take responsibility for making their own product work safely, the solution being suggested by the paint people is to change the paint formulations for everyone else, to make things easy for the self-driving cars. If the guy at PPG has his way, black cars won't be finished with primer, black paint and a clear coat any more. The new paradigm will require a candy-apple type of paint job, with a primer, a metallic base, a translucent color coat, and a clear coat on top. Everyone who knows cars knows that kind of finish costs more.

Somehow I get the impression that we're all going to end up paying in some way to finance the self-driving car thing, even if we don't want to be part of it.

Sorry, no link. Bloomberg TV says to check Bloomberg magazine.
 
g1 1/19/2018 1:35 PM
This will only be while the transition is being made. Don't kid yourself about a world where there are both self-driving and non self-driving vehicles.
 
nevetslab 1/19/2018 1:52 PM
I wonder how the optics deal with dirty cars?
 
nosaj 1/19/2018 2:22 PM
Quote Originally Posted by nevetslab View Post
I wonder how the optics deal with dirty cars?
Yeah, the one thing the engineers won't think of. Leave it to the techs to figure out the real world issues.



nosaj
 
bob p 1/19/2018 2:26 PM
Quote Originally Posted by g1 View Post
This will only be while the transition is being made. Don't kid yourself about a world where there are both self-driving and non self-driving vehicles.
Said by the guy who still drives carbureted vehicles!

I don't see it happening in my lifetime. Afterwards? I don't really care.

Expecting that everyone will adopt self-driving cars isn't a logical expectation. Sure, there will be a large number of people who will just do what they're told to do in exchange for their entitlement check, but the enthusiast demographic views things entirely differently. There are people who like to drive for the fun of it, and they aren't going to be interested in giving up their driving privileges or their Camaro, Corvette or 4x4 pickup in exchange for a ride in a Johnny Cab.



I particularly liked that segment where Arnold got pissed off at the computerized car and ripped the Johnny out of the Cab... and then the cab tried to run over Arnold for not paying.
 
bob p 1/19/2018 2:30 PM
Learning to drive and gaining independence is part of growing up in America. I hate to think about a time when kids no longer look forward to getting their drivers' license as a rite of passage and as a symbol of growing up. Somehow, replacing cars with autonomous vehicles seems like we're sacrificing part of American culture. Why is this being pushed on us?

I don't see an exclusive transition happening any time soon. Driving is too much a part of American culture and I don't see people willingly giving it up. For it to happen, people will have to have choice taken away from them.

Even if autonomous cars become popular, there will still be vintage, antique, and classic cars that get driven on the roadways. Some are garage queens, for sure, but a small number of them do get driven as daily drivers. That may not make sense to some people, but it makes sense to guys who are tired of seeing the price of a new car go up and up and up to absurd levels. Eventually the price of a new car rises to the point that some people consider whether they'd rather spend $30,000 on a classic car and drive it into the ground as a daily driver, or make payments on a $30,000 2018 Honda Accord EX-L. Personally, I'd rather have a lot more fun driving the classic car instead of being just another guy driving around in an Accord or a minivan, but that's just me.

The idea that "this will be only while the transition is being made" doesn't make sense to me. People still ride horses on roadways. People still like their 1950 Mercs. Around here people drive golf carts on the streets in the summertime. The local young Hispianic enthusiasts like to cruise the neighborhood streets in little Civics with tiny wide wheels, while the local older Hispanic guys like bigger cars like classic Impalas. And the local Black guys like taking huge old 1970-80s cars to build customized "cribs" that don't look anything like new cars, and they drive them everywhere they go. Some of those cars are 50 years old and are being restored and customized as daily drivers.

Car guys like their cars. They customize them to fit their personalities. They view them as an expression of their individuality, and many of them aren't going to be willing to be homogenized into owning a Google self-driving car. To take away their car keys you'll have to pry them from their cold dead fingers. They won't be buying Google cars and they won't be painting their cars in the special color that's required to make them more recognizable to self-driving cars. To avoid vehicles on the road, self-driving cars need to be designed to recognize whatever is on the street, without assuming that the entire world is going to get repainted to satisfy the needs of autonomous car designers who can't make things work on their own.

This brings up the question -- why is there this sudden big push for self driving cars? Who really wants them, and why are all of the car manufacturers suddenly spending lots of money on self-driving cars?

There's no question that the tech companies are just trying to push tech that they can force into the auto industry to bring them profits. And there's no question that the auto companies would like to cut costs by eliminating the driving interface and the dashboard, which are expensive parts of a car. And there's no doubt that the Big 3 are having trouble meeting the new 2017 EPA CAFE standards, which has prompted a major re-think of car designs in Detroit.

I can see that it's about making the new cars cheaper to produce, and decreasing weight by eliminating parts to comply with the more rigorous CAFE standards. But are people going to be eager to buy them? I'm not so sure. The car culture is pretty strong in America. I think it's going to take generations of brainwashing to breed the love of cars out of people before autonomous cars become universally accepted as the exclusive means of transportation. I think that the only people who will accept that paradigm are the people who have never known anything else, and don't know what they're missing. I don't see that happening in my lifetime.
 
eschertron 1/19/2018 2:56 PM
Quote Originally Posted by bob p View Post
Learning to drive and gaining independence is part of growing up in America.
My wife and I have seen the film 'American Graffiti' on TV a couple of times in recent months. Yep, we sure loved our cars... what, about 50 years ago? You're showing our age, Bob

My kids, in love with cars, not so much. A car is nice to have, but it represents responsibility and regulation now more than freedom or individuality. Give me good public transportation any day.* Maybe the self-driving cars are seen as a stepping stone to (or integrated with) clean, efficient mass transit. Now that's a picture of the future that's been around for decades!



* like someplace like, um... Chicago!
 
Enzo 1/19/2018 6:13 PM
We can always find exceptions. We can read about a guy who rode a circus elephant down the freeway, at least until they stopped him. But those are not descriptive of the real world, just odd stories.

Here in Michigan there is serious concern over the ability of these things to even see the lanes. Roads are often snow covered. Often, especially when I lived in rural Michigan, I could only tell where my road was by the mailboxes and stuff on the sides. During bad conditions, the two lanes of the interstate are often reduced to one pair of tire tracks down the center. These cars need to be able to figure that crap out.

The tree thing is a fun story, but is hardly the case today.

I don't like the self driving car thing myself. They scare me. But the idea behind them is they foresee people owning fewer cars. Instead of buying, maintaining, and insuring a car, these things will be around like an automatic Uber. Groups of people could own one and share it too. Think of them as a "people mover" without a fixed route.

I doubt they will be available to you and me very soon. They want to sell them in fleets.
 
The Dude 1/19/2018 6:19 PM
There was an urban legend back when cruise control came out that a guy driving a van set the cruise, got up, and went to the back to make a sandwich or something- thinking the car would drive itself. I have no idea if it's actually true or not, but it wouldn't surprise me and I always thought it was a funny story. Probably not so funny for the rest of the drivers on that road (if true).
 
Enzo 1/19/2018 6:23 PM
There is a series of books, starting with "The Vanishing Hitchhiker" by Jan Harold Brunvand. Collections of urban myths. Full of fun stories like that.
 
Chuck H 1/19/2018 6:52 PM
Quote Originally Posted by The Dude View Post
There was an urban legend back when cruise control came out that a guy driving a van set the cruise, got up, and went to the back to make a sandwich or something- thinking the car would drive itself. I have no idea if it's actually true or not, but it wouldn't surprise me and I always thought it was a funny story. Probably not so funny for the rest of the drivers on that road (if true).
That reminds me of a joke...

When I pass on I hope to go like my grandfather. Peacefully in his sleep and not screaming in terror like his passengers.

And I love those sorts of books Enzo. I'll be looking that up on my Kindle tonight

The first thing I thought when I saw the thread title was "Robotic cars can be racist?"
 
g1 1/19/2018 6:58 PM
Bob, your age is showing. Was wearing onions on your belt the style at the time? (just kidding, obscure Simpsons reference)
I hope you don't think what I believe is the way things are is the same as how I would like them to be.
I'm also big on car culture, but I think that the choice you allude to has already been made. The younger drivers have made the choice and they don't care if they lose the ability to drive as long as they can keep their tech (smart phones, onboard video etc.). We can't have both, as it's too dangerous, as is borne out by the statistics.
You think people don't want to give up car culture, I think they already have. The generation that will miss it (us) is almost gone.
The word 'driving' in an automotive sense will someday sound as quaint as 'horseless carriage'. Maybe not as soon as I think, but it seems inevitable. I'm not saying enthusiasts won't be able to keep their cars, they just won't be allowed on the automated roadways, like horses are not allowed on highways.

P.S. They don't even have to outlaw anything, they can just raise the insurance to the point everyone quits using them.
 
Enzo 1/19/2018 6:59 PM
I liked the books. Brunvand wrote The Vanishing Hitchhiker, The Choking Doberman, and The Mexican Pet. I know he wrote others. There are other urban legend books of course. And the snopes.com web site watches for people who believe them.
 
Chuck H 1/19/2018 7:35 PM
"But we had to wear the yellow onions. We couldn't get the white onions on account of the war!"
 
eschertron 1/19/2018 7:55 PM
I have friends who are 'car guys'. Some have vintage autos, some have not-yet-historical but buffed-up rods. Most enjoy the time for cruising, or shows, or amateur night at the local drag strip. All of these activities represent why we like cars. Nobody says "Hey, let's jump in the car and go sit in traffic for the next two hours!". But with the congestion on our roads (US population as a whole has more than doubled since 1950, more in the cities) driving just doesn't have the appeal it once did. Less fresh air and more road rage.

Even in the future, people will own antique cars for the same reason people own antique anything. For the joy in possession, and the opportunity to occasionally exercise their use.
 
Chuck H 1/19/2018 8:13 PM
"My" car (everyone had one) was a 1970 Firebird. First year F body with the Ram Air hood and the Ram Air III Formula 400 ci engine. 411 gears and a Hurst shifter. All I did was install an Edelbrock high rise manifold with a Holly 4 barrel on top, some headers and a cheater cam. You couldn't beat me dragging. No way.
 
dmartn149 1/19/2018 9:53 PM
Do young people "cruise" on Friday and Saturday nights where you live? They don't here. Too busy tweeting, texting and gaming I guess.
 
Enzo 1/19/2018 11:27 PM
I think car culture is fading. I think it is a matter of what you looked up to as a kid. Nowadays kids hang out on their phones. Used to be there was not much else to do but gather somewhere. So Saturday night, cruise downtown, gather at the drive-in burger joint. My generation thinks a 57 Chevy is a real cool car. Stock, or drop a crate motor in it, whatever. Our kids think an Olds 442 is a cool car. Is there any car that is cool from our recent turn of the century?

We'd gather at a diner or "malt shop" for a double date. Now kids go out together and sit in some place, all four staring at their phones. I don't see parking lots full of cars with kids hanging around them now, haven't for many years. Our area had several drive-in restaurants, but they have all faded or closed.

I've watched the juke box thing fade away. My generation saw juke boxes at the places we hung out at. When we grew up, we got a juke box for the basement. A zillion GM workers here in this area had tons of spare money, and I swear half of them went out, built a four stool wet bar in the basement, got a pool table, a juke box, and a pinball machine. A new song came out, you got it on a 45, stuck it in your juke. Now any jukeboxes in restaurants or bars are all internet connected so you can choose from thousands of tunes. Kids all have an iPod or something anyway. Most places just have a satellite radio anymore. I used to make a living servicing those home entertainment things, but the jukes faded away. Homes now have dad's PacMan stand-up arcade video game. The kids are not interested. My friends who bought and sold jukes gave up.

Kids today don't get cars to go cruising or to show off.
 
bob p 1/20/2018 5:43 AM
Quote Originally Posted by Chuck H View Post
When I pass on I hope to go like my grandfather. Peacefully in his sleep and not screaming in terror like his passengers.
I laughed like hell at that one, Chuck. I'm wiping coffee off of my monitor now, thanks.

The reason I thought it was so funny is because I had that happen to me once. I was riding home from the airport after taking the red-eye, coming home in one of those short-bus airport "limousines". We're heading down the interstate in the middle of the night and the driver nods off... and the bus starts gradually edging off of the road onto the shoulder. A couple of people started screaming at him... no response. I was right behind him, so I got up and smacked him on the back of the head. Problem solved.
 
bob p 1/20/2018 5:46 AM
Quote Originally Posted by eschertron View Post
Even in the future, people will own antique cars for the same reason people own antique anything. For the joy in possession, and the opportunity to occasionally exercise their use.
When it comes to the car culture, cars are like tube amps.
 
Chuck H 1/20/2018 6:30 AM
Quote Originally Posted by dmartn149 View Post
Do young people "cruise" on Friday and Saturday nights where you live? They don't here. Too busy tweeting, texting and gaming I guess.
When I was young cruising was still cool. All the video game arcades closed up at 9:00 or 10:00. We'd even rev motors and drag off the red at intersections. The cops were pretty good about it even though it was a main street in a suburban shopping district. Good times. Even then there were newer, hip cars with loud sound systems and faux air foils and such. If you had a muscle car (like mine) you were a nerd/motor head.

Quote Originally Posted by bob p View Post
When it comes to the car culture, cars are like tube amps.
That's always been my perception too. Of course this evolution has been in progress since Enzo's days, as he outlined above. My family had a jukebox because my dad was a tinkerer and liked fixing mechanized things. So he bought a broken one just to fix it. When he was done we took out the 45's we didn't like and replaced them with ones we did. Lots of fun trips to Tower Records. I was a 70's kid and I really think I got the best of my era and the eras fore and aft. Sometimes I felt progressive and other times I felt like the last of a generation before my time. But it was great.

Self driving cars? That wouldn't have been any fun at all. That's probably why someone is working on the technology. Another way to take something fun away from the sheeple and turn us all into automatons
 
bob p 1/20/2018 6:45 AM
Quote Originally Posted by Enzo View Post
I think car culture is fading... Is there any car that is cool from our recent turn of the century?
you've hit the nail on the head, Enzo. Car culture is fading, because there aren't many cool cars any more. Why? Because instead of building cars that people like, the Big 3 are building cars that the government likes.

Our Federal Government has mandated what kind of cars have to be manufactured, with things like safety standards and fuel economy standards. Gone are those days when you could be riding in the front seat of the wagon as a kid, and when Dad had to hit the brakes he'd stick his right arm out to keep you in the seat. Now the cars have airbags and the kids have to be buckled down in the back seat of the minivan or Dad goes to jail. And cars aren't easy to service any more. Instead of tuning a carb, now there's a computer interface that controls fuel injection, and the days of a kid getting an old junker to hot rod it are fading fast. And the cars that are available just aren't cool any more, so nobody even wants to spend the time hot rodding them. Now cars are designed for the collective good of society rather than the individual good of the owner. As a result there is just no reason to love a car like the ones they're putting out now:

2018 Smart Fortwo
[IMG]https://images.duckduckgo.com/iu/?u=http%3A%2F%2Fautocarupdates.com%2Fwp-content%2Fuploads%2F2017%2F03%2F2018-smart-fortwo-electric-drive-review-leak-1920-x-1080.jpg&f=1[/IMG]

And this is what we have to look forward to:

Google Self-Driving Car
[IMG]http://music-electronics-forum.com/attachment.php?attachmentid=46677&d=1516454529[/IMG]


The reason that car culture is dying is because the government is controlling what kind of cars can be manufactured, and those controls/mandates have rendered affordable cars undesirable and desirable cars unaffordable; if you want a fugly econobox, you can have any car that you like. If you want a full size Suburban, it'll cost you $70,000 now. Not kidding. If Chuck wants a real musclecar, he'll have to pay $70,000 for a Dodge Hellcat. Not kidding about that either.


Dodge Hellcat: $70,000 for 700 HP
[IMG]http://music-electronics-forum.com/attachment.php?attachmentid=46676&d=1516453652[/IMG]

Why are the prices so damned high? Is it a coincidence that those two cars both cost $70,000?

It's the CAFE Standards. It's not that a Suburban costs $70,000 to build now -- it's that GM has to price the Suburban so that few people will buy them, so that they can meet the fleet fuel economy standards set by the Feds. Big cars are being legislated out of the market by making them unaffordable by the common man. It's not so much that car culture is dying. It'd be more accurate to say that it's being killed off through legislation.
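For anyone who wants to see the arithmetic, here's a minimal sketch of how a CAFE-style fleet number works: it's a sales-weighted harmonic mean of mpg, so shifting the sales mix away from the big vehicles (for example by pricing them high) pulls the fleet average up. The models, mpg figures and sales volumes below are made up purely for illustration.

[CODE]
# Minimal sketch of a CAFE-style fleet average: a sales-weighted harmonic
# mean of mpg.  All models, mpg figures and sales volumes are made up.

def fleet_mpg(fleet):
    """fleet: list of (sales, mpg) tuples -> sales-weighted harmonic mean mpg."""
    total_sales = sum(sales for sales, _ in fleet)
    return total_sales / sum(sales / mpg for sales, mpg in fleet)

# Hypothetical mix where the big SUV sells in volume
suv_heavy = [(300_000, 17.0),   # full-size SUV
             (200_000, 32.0)]   # compact sedan

# Same two vehicles, but the SUV priced high enough that few sell
sedan_heavy = [(50_000, 17.0),
               (450_000, 32.0)]

print(f"SUV-heavy mix:   {fleet_mpg(suv_heavy):.1f} mpg")
print(f"Sedan-heavy mix: {fleet_mpg(sedan_heavy):.1f} mpg")
[/CODE]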
 
bob p 1/20/2018 6:52 AM
Quote Originally Posted by Chuck H View Post
Self driving cars? That wouldn't have been any fun at all. That's probably why someone is working on the technology. Another way to take something fun away from the sheeple and turn us all into automatons
I blame those evil fuck billionaires at google. They're out to subjugate the world.

Of course I'm kidding... or am I? Have you heard about some of the mandatory employee training that they push on people at Google? They're out to brainwash the world.
 
Chuck H 1/20/2018 7:06 AM
Quote Originally Posted by bob p View Post
I blame those evil fuck billionaires at google. They're out to subjugate the world.

Of course I'm kidding... or am I? Have you heard about some of the mandatory employee training that they push on people at Google? They're out to brainwash the world.
Driving their smart cars around with that smug look of green planet superiority on their pasty faces

It's just how it's going down, man. And it's not all bad. In fact most of it is for the best. We can certainly crab about it because we miss familiarity. We'll never fully acclimate like the current generation, and that leaves us behind. Worse than being disagreed with, we're utterly unconsidered, like we don't matter. In that light you have to keep up or get run over.
 
bob p 1/20/2018 7:25 AM
It's unfortunate, but when people amass too much wealth they insist on trying to change the world, and they punish people who don't agree with them. Google isn't an exception. Check this out -- I found it when it appeared as one of those irritating youtube ads that gets in the way of watching videos.

 
Chuck H 1/20/2018 8:09 AM
Quote Originally Posted by bob p View Post
It's unfortunate, but when people amass too much wealth they insist on trying to change the world, and they punish people who don't agree with them. Google isn't an exception.
Well, how else is the world going to change? What worked before is now recognized to be killing the planet. And while I think that modern gender issue perception is utterly foolish I don't expect everyone to agree. And I don't have the power to change anything, but *oogle does. If 80% of what *oogle promotes is good, that's good. I don't expect panacea from circumstance, just something I can work with going in the generally correct direction. In the 20's and 30's if you didn't smoke and drink like a fiend you couldn't get a job in business. A different power, and it influenced to the best of its ability the direction of cultural trends. The world of people is going to change. It always has. It's never been perfect because people are imperfect. *oogle isn't responsible for evil, they are just a part of where we're collectively going. That doesn't mean that anyone should ever stop trying to idealize and make whatever is wrong better. Of course we should. If we didn't, every *oogle mogul would be drinking a scotch, smoking a cigarette and blathering some racist, sexist opinions to his receptive co-workers. In other words, good on James Damore for making a noise about injustice as he sees it. It's important and should be done. But I don't see it as a revelation and I'm not ready to stand on the corner with a cardboard sign that says "*oogle is mind control".
 
nosaj 1/20/2018 8:14 AM
Quote Originally Posted by Chuck H View Post
Well, how else is the world going to change? What worked before is now recognized to be killing the planet. And while I think that modern gender issue perception is utterly foolish I don't expect everyone to agree. And I don't have the power to change anything, but *oogle does. If 80% of what *oogle promotes is good, that's good. I don't expect panacea from circumstance, just something I can work with going in the generally correct direction. In the 20's and 30's if you didn't smoke and drink like a fiend you couldn't get a job in business. A different power, and it influenced to the best of its ability the direction of cultural trends. The world of people is going to change. It always has. It's never been perfect because people are imperfect. *oogle isn't responsible for evil, they are just a part of where we're collectively going. That doesn't mean that anyone should ever stop trying to idealize and make whatever is wrong better. Of course we should. If we didn't, every *oogle mogul would be drinking a scotch, smoking a cigarette and blathering some racist, sexist opinions to his receptive co-workers. In other words, good on James Damore for making a noise about injustice as he sees it. It's important and should be done. But I don't see it as a revelation and I'm not ready to stand on the corner with a cardboard sign that says "*oogle is mind control".
What's the matter the tin foil hat doesn't go with your ward robe? just joshin here.

nosaj
 
bob p 1/20/2018 8:25 AM
Quote Originally Posted by Chuck H View Post
But I don't see it as a revelation and I'm not ready to stand on the corner with a cardboard sign that says "*oogle is mind control".
I'm not going to stand out on the streetcorner either. It's friggin' cold here in Chicago and the tin foil hat just can't keep my ears warm.
 
Chuck H 1/20/2018 8:28 AM
Quote Originally Posted by nosaj View Post
What's the matter the tin foil hat doesn't go with your ward robe? just joshin here.

nosaj
Oh, I'm a nutty conspiracy believer to be sure. I just don't see it in *oogle any more than I see it in pop music or fashion. It's just part of where we're going, for now. Erasing gender bias, even WRT the invented genders, on some levels is a good thing. But, as stated by Mr. Damore, it's probably better done via recognition of differences rather than saying we're all the same. Didn't we learn anything trying to do that with racism!?! It's just more square pegs in round holes. Wait a minute!.. Huh?..

It just occurred to me that there may be a cultural control mechanism used by the top 1% that racism was facilitating and that since racism is diminishing this new issue of gender roles and bias may be a culturally planted replacement to maintain power!

Ok, I'm putting the tin foil hat back on.
 
Steve A. 1/23/2018 10:03 AM
As others have said I will not ride in a self-driving vehicle unless there is a steering wheel and brake pedal that I can use if the computer screws up. And yes, I do wear both a belt and suspenders.

In answer to the question of why self-driving vehicles are being pushed on the public: they will eventually reduce vehicle accidents, injuries and deaths, making it much safer to be on the roadways.

For visual lane marking I think that a standardized thermoplastic striping could be used to simplify detection by the self-driving computers. Perhaps something could be added to allow it to be detected by non-visual methods to reduce problems due to poor visibility. Thermoplastic striping has an expected life of 3 to 6 years so it would need to be reapplied regularly.

https://en.m.wikipedia.org/wiki/Road...#Thermoplastic
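Just to make the idea concrete, here's a minimal sketch of the kind of optical stripe detection that a standardized high-contrast marking would simplify -- plain edge detection plus a Hough transform with OpenCV. The image file name, the region-of-interest shape and the threshold numbers are placeholders; real lane-keeping systems are far more involved than this.

[CODE]
# Minimal sketch of optical lane-stripe detection: edge detection plus a
# probabilistic Hough transform to pick out bright, roughly straight segments.
# "road.jpg" and all threshold values are placeholders for illustration.
import cv2
import numpy as np

frame = cv2.imread("road.jpg")                      # hypothetical camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
blur = cv2.GaussianBlur(gray, (5, 5), 0)
edges = cv2.Canny(blur, 50, 150)                    # edge map of the scene

# Keep only a trapezoid roughly where the lane ahead should be
h, w = edges.shape
mask = np.zeros_like(edges)
roi = np.array([[(0, h), (w, h), (w // 2 + 60, h // 2), (w // 2 - 60, h // 2)]],
               dtype=np.int32)
cv2.fillPoly(mask, roi, 255)
edges = cv2.bitwise_and(edges, mask)

# Long, bright, straight segments are treated as painted stripes
lines = cv2.HoughLinesP(edges, rho=2, theta=np.pi / 180, threshold=50,
                        minLineLength=40, maxLineGap=20)
for x1, y1, x2, y2 in (lines.reshape(-1, 4) if lines is not None else []):
    cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 3)
cv2.imwrite("lanes.jpg", frame)
[/CODE]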


Here is a paragraph I just deleted for reasons listed below the quote:

However there are many changes to the roadways which will be required. Traditional lane markers will not work. I think that all asphalt roads should have markers embedded in the pavement, perhaps every 12 feet for straight stretches and every 6 feet for curves. I have no idea what material could be used that would require no power but could be accurately detected by the self-driving car. Hey, we could use spent uranium rods from reactors instead of burying them in Nevada! Just kidding...
I just deleted that because once "geo" markers are inserted into asphalt it could be difficult to remove them if the lanes were changed. I had initially thought of placing them along the lines between lanes but it'd make more sense to place them in the middle of the lane... if there was an easy way to remove them if the lanes changed. I do think that we might eventually switch to a system like that if all of the bugs could be worked out.

Actually it might not be that difficult to remove the "geo" markers since a machine could locate them exactly and an appropriately sized hole saw could extract them, with the hole filled in immediately with asphalt — all done by machinery so as to not put highway workers in danger.
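Just as a thought experiment, here's a rough sketch of how marker positions might be generated from road geometry, spacing them closer where the centerline curves (the 12 ft / 6 ft figures from the deleted paragraph above). The sample centerline and the turn threshold are made-up values, and this ignores surveying, datum and tolerance issues entirely.

[CODE]
# Rough sketch: place "geo" markers along a road centerline, 12 ft apart on
# straight stretches and 6 ft apart where the line curves noticeably.
# The centerline below is invented; real placement would use survey data.
import math

def heading(p, q):
    return math.atan2(q[1] - p[1], q[0] - p[0])

def marker_positions(centerline, straight_ft=12.0, curve_ft=6.0, turn_thresh=0.05):
    """centerline: ordered list of (x, y) points in feet, densely sampled."""
    markers = [centerline[0]]
    dist_since_last = 0.0
    for i in range(1, len(centerline) - 1):
        p, q, r = centerline[i - 1], centerline[i], centerline[i + 1]
        dist_since_last += math.dist(p, q)
        # Heading change per step is a crude stand-in for curvature
        turning = abs(heading(q, r) - heading(p, q)) > turn_thresh
        spacing = curve_ft if turning else straight_ft
        if dist_since_last >= spacing:
            markers.append(q)
            dist_since_last = 0.0
    return markers

# Made-up centerline: a straight run followed by a gentle curve, 1 ft steps
straight = [(float(x), 0.0) for x in range(100)]
curve = [(100 + 30 * math.sin(t / 30), 30 - 30 * math.cos(t / 30))
         for t in range(1, 60)]
print(len(marker_positions(straight + curve)), "markers placed")
[/CODE]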

In any case, I believe that lane markers will need to be upgraded to minimum standards as necessary, which will require a large investment in our roadway infrastructure, as was done here after WWII when our military leaders came home very impressed by the roadways in Germany.

During the campaign Trump promised to invest heavily in our infrastructure, which would cover the expense of the roadway marking upgrades, but whoops! The GOP just gave away $1 trillion to big business and the 1% over the next 10 years, so major infrastructure repairs and upgrades will need to wait until 2028.

To be fair, if the massive tax giveaway actually does increase the annual growth of our GDP by 3 to 5% as predicted by experts hand-picked by the GOP, then it will not be known as Trump's Boondoggle among future historians. "Hand-picked experts" has a nice ring to it... they obviously must be the best experts for us to listen to — right? NOT!

Steve A.

P.S. There is one application for self-driving vehicles that is apparently ready to go once testing has been completed: convoys of, say, 5 or 6 trucks on the freeway with a human driver in the front vehicle, called "platooning" in the Engadget article.

https://www.wired.com/2016/07/armys-...repare-battle/

https://www.engadget.com/2017/08/25/...ooning-trials/

P.P.S. As for America's love affair with automobiles and trucks going back almost 100 years, the newer generations seem to be falling out of love. With most of Generation X and Millennials doomed to lower wages than Baby Boomers, the necessity of owning two cars (or even one!) is being questioned by those living in metropolitan areas with adequate public transportation. Adding up the purchase price and the cost of insurance, fuel, upkeep and repairs of a second car, it is often cheaper to just use Uber or public transportation as necessary. For people living in many cities finding a parking spot can be a real challenge, and as home delivery services keep getting cheaper and cheaper it could be cheaper to have purchases too large for public transportation and Uber shipped to your house than to bear the costs of owning a second vehicle.
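For what it's worth, here is a back-of-the-envelope version of that comparison. Every figure below is made up purely for illustration; plug in your own numbers.

[CODE]
# Back-of-the-envelope comparison: keeping a second car vs. ride-hailing and
# transit for the occasional trip.  Every figure is a made-up placeholder.
second_car = {
    "depreciation": 2500,   # per year on the purchase price
    "insurance":    1200,
    "fuel":         1100,
    "upkeep":        800,
    "parking":       600,
}
per_year_car = sum(second_car.values())

uber_trips_per_week = 6
avg_fare = 14.0
transit_pass_per_month = 75.0
per_year_alt = uber_trips_per_week * 52 * avg_fare + transit_pass_per_month * 12

print(f"Second car:     ${per_year_car:,.0f}/yr")
print(f"Uber + transit: ${per_year_alt:,.0f}/yr")
print("Cheaper option:", "second car" if per_year_car < per_year_alt else "Uber + transit")
[/CODE]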
 
bob p 1/23/2018 10:34 AM
I'm not sure how partisan politics plays into self-driving cars not being able to see black, but I'm sure there's some contributing factor that I'd rather not know about.

The problem of lane identification brings up an important part of the problem -- infrastructure build-out. If self-driving cars require roadways to be modified so that self-driving cars can navigate, then the condition of the roads is going to have to be a lot better than it is today. Around here there are plenty of roads that have fallen into disrepair, and expecting the roadways to be maintained in a higher state of repair than they are in today would be a very lofty goal. And an expensive one.

Who should pay for that expense? Not being a guy who ever wants to ride in a self-driving car, I don't want to pay for it. I'd rather see our already crumbling roads get fixed. If there are people who want to own self-driving cars, then let them pay for upgrades needed to make the roadways suitable for their special-needs cars. I don't think it's fair to distribute that cost to everyone. Why not just build a road-compatibility upgrade tax into the price of the self-driving car?

One thing that will be an important consideration is maintenance of the roadways. If self-driving cars require embedded markers to be placed in the roadways, that'll work just fine on the freeways, but it won't work so well on city streets and country roads that are often neglected and fall into disrepair. There are still a lot of gravel and dirt roads in remote areas of the country that aren't likely to ever get "smart roadway" upgrades. In some respects I think it'd be fair to bill those special-needs costs to the owners of those vehicles that have special needs, rather than forcing everyone to subsidize their vehicle purchase. If those costs get billed to the population at large, then there are going to be rural folks who drive on gravel roads who have to pay for a fancy road somewhere else that they never get to drive on.

The cost of upgrading all roads to make them self-driving compatible makes it an unrealistic goal. There are just too many miles of roadway in the USA to make all of them compatible. The result, I think, will be to make selective roadways self-driving compatible, perhaps the interstates in the most densely populated areas. And in the absence of having all roadways compatible, we're still going to be faced with relying on a sensor in the self-driving car to collect actionable data. As the Tesla accident has shown us (where the Tesla failed to discriminate a white semi trailer from a bright, back-lit sky and killed the driver), optical recognition alone isn't good enough yet.
 
bob p 1/23/2018 10:41 AM
Quote Originally Posted by Steve A. View Post
As others have said I will not ride in a self-driving vehicle unless there is a steering wheel and brake pedal that I can use if the computer screws up. And yes, I do wear both a belt and suspenders.
I was watching a BBC special on youtube that addressed some of the ethical dilemmas that they thought would come with self-driving cars. I don't have the link but it's there if anyone is interested.

One of the issues that they brought up was the moral dilemma of how does the computer make the decision about who to kill in an unavoidable accident scenario. If the semi ahead of you stops, and you can't stop in time to avoid it, does the self-driving car change lanes to collide with a car on your left, a motorcycle on your right, or a pedestrian? Or what if the computer sensor in your car knows that there's only one passenger in the car, and to avoid killing you in an accident, the computer would have to decide to run into 2 pedestrians?

Being the selfish jerk that I am, I'll always choose to preserve my life over the life of a stranger, because my life is more valuable to me. I'm not sure that the computer would be programmed to think the same way though. It may not have my best interests in mind, and to me that represents a major problem.
 
eschertron 1/23/2018 10:51 AM
Quote Originally Posted by bob p View Post
...the moral dilemma of how does the computer make the decision about who to kill in an unavoidable accident scenario.
I saw that movie too!

 
Steve A. 1/23/2018 11:54 AM
Quote Originally Posted by bob p View Post
It's unfortunate, but when people amass too much wealth they insist on trying to change the world, and they punish people who don't agree with them. Google isn't an exception. Check this out -- I found it when it appeared as one of those irritating youtube ads that gets in the way of watching videos.

https://www.youtube.com/watch?v=f9_o42QaVnA
Yes, I consider YouTube videos published by an ultra-conservative organization like PragerU*** to be an excellent source of unbiased information... NOT!

PragerU ("Prager University") is a 501(c)3 non-profit conservative digital media organization. Despite the organization's name, Prager University is not an educational institution.
• PragerU was founded in 2009 by conservative radio talk show host Dennis Prager and radio producer and screenwriter Allen Estrin who wrote "Pocahontas II: Journey to a New World." It is not an academic institution and does not offer certifications or diplomas.
• Prager created PragerU with Estrin as his business partner in order to present his conservative views and to offset what he regards as the undermining of college education by the left. The videos usually feature a speaker who argues a particular side of a debate for about five minutes.
https://en.m.wikipedia.org/wiki/PragerU

*** The same goes for blatantly one-sided videos published by ultra-progressive organizations.

-=÷=◇=÷=○=÷=◇=÷=-

Here is a link to the memo itself from the author's website so that we can see exactly what he had to say, followed by a link to the attachment here:

https://assets.documentcloud.org/doc...ho-Chamber.pdf

[ATTACH]46723[/ATTACH]

From what I've gathered his memo is based on some questionable premises... diversity in employment does NOT suggest that men and women are equal — it says that they should be given equal opportunities.

-=÷=◇=÷=○=÷=◇=÷=-

Here is an excerpt from a Washington Post article last August:

The Google memo is a reminder that we generally don’t have free speech at work
by Jena McGregor [The Washington Post 08/08/2017]


>>> ...Legal experts note that there are some workplace protections on speech. A relevant one, in the case of the Google engineer who has been identified in media reports as the memo's author, James Damore, is that the National Labor Relations Act does protect workers who engage in “concerted activities” for their “mutual aid or protection.” In other words, said James McDonald, the managing partner of the Irvine, Calif., office of the employment law firm Fisher Phillips, it “has to be apparent that an employee is speaking for a group of employees, like saying 'I'm a spokesperson,' or at least be an invitation to engage in concerted activity.”
• Yet the memo, he said, “reads like one person's critique of Google's management philosophy as opposed to a call to action” for co-workers to “rise up and protest.” Damore, according to a report in Reuters, has said that he is exploring his legal remedies and that he submitted a charge to the National Labor Relations Board before his termination.
• A Google spokesperson said the company does not comment on individual cases or employees but said the company determined that the portion of the post that references gender stereotypes violates its code of conduct and policies against harassment and discrimination. Damore did not respond to a message sent to his LinkedIn profile or a Harvard University Web page with the same name.
• In a message published Monday, Google chief executive Sundar Pichai opened by saying that “we strongly support the right of Googlers to express themselves, and much of what was in that memo is fair to debate, regardless of whether a vast majority of Googlers disagree with it.” But “to suggest a group of our colleagues have traits that make them less biologically suited to that work is offensive and not okay. It is contrary to our basic values and our Code of Conduct.”
• Even if Damore established that his memo amounted to “concerted activity,” said William B. Gould, a professor emeritus at Stanford Law School and a former chairman of the National Labor Relations Board, Google may still be able to assert that the speech crosses a line on stereotypes about women and that it was disruptive and could create a hostile work environment. He also noted that if Damore were able to prove that he was fired because he filed a charge with the NLRB, that would be a violation of the law regardless of the charge's merits. <<<
https://www.washingtonpost.com/news/...peech-at-work/

-=÷=◇=÷=○=÷=◇=÷=-

Here is a link to an article from the Washington Post offering details on how Damore's firing was picked up by the Alt-Right, making him their poster child and helping his story migrate to the mainstream press...

Analysis | How James Damore went from Google employee to right-wing Internet hero
by Abby Ohlheiser [The Washington Post 08/12/2017]
https://www.washingtonpost.com/news/...internet-hero/

-=÷=◇=÷=○=÷=◇=÷=-

BTW California is an "at will" state in regards to employment — unless specifically excluded in an employment contract an employer can fire a worker for any reason at all, at least as long as the reason is not illegal.


Steve A.

P.S. I have a long list of complaints about Google, ranging from their search engine to their Android operating system, to be compiled and posted at a later date. I have not studied the 16 page PDF file of "Google’s Ideological Echo Chamber" so I have not yet formed my own opinion as to whether his firing was just or unjust, but I am certainly not going to base it on a one-sided YouTube video with cute cartoon figures...

[img]http://music-electronics-forum.com/attachments/46724d1516734467-screenshot_2018-01-23-11-06-52_20180123110823219.jpg[/img]
 
Enzo 1/23/2018 12:16 PM
ethical dilemmas that they thought would come with self-driving cars.
I think this is one of those red herrings you like to point out.

The only ethical mission a car design could reasonably be expected to have is to save its occupants. It would get impossibly complex expecting the car to be aware of all possible outcomes in every situation. If the choice is to ram a small car to the side to avoid ramming a school bus ahead, how does our car know if the school bus is full of kids or just a driver? Even if it could count the kids, would it be able to determine that hitting the big high-sitting school bus would likely be less threatening to those kids on board than ramming the small car would be to its occupant? I can see a pedestrian and make a judgement, but would the car be able to tell the difference between a young healthy college kid who might leap out of the way versus an old lame person? Can we reasonably expect the car to muse whether it prefers many injured school kids or one dead driver?

The car cannot be second guessing itself, it has the primary mission of providing safety to its own occupants.
 
Steve A. 1/23/2018 12:32 PM
Quote Originally Posted by Enzo View Post
I think this is one of those red herrings you like to point out.

The only ethical mission a car design could reasonably be expected to have is to save its occupants. It would get impossibly complex expecting the car to be aware of all possible outcomes in every situation. If the choice is to ram a small car to the side to avoid ramming a school bus ahead, how does our car know if the school bus is full of kids or just a driver? Even if it could count the kids, would it be able to determine that hitting the big high-sitting school bus would likely be less threatening to those kids on board than ramming the small car would be to its occupant? I can see a pedestrian and make a judgement, but would the car be able to tell the difference between a young healthy college kid who might leap out of the way versus an old lame person? Can we reasonably expect the car to muse whether it prefers many injured school kids or one dead driver?

The car cannot be second guessing itself, it has the primary mission of providing safety to its own occupants.
With enough self-driving cars out there that scenario might never happen... there would be no decision to make as to which vehicle to run into.

Which brings up a new issue to this thread: many new cars have all sorts of devices which make driving much safer, like automatic proximity detection to help keep you from hitting the car in front of you. (Question: does that system apply the brakes itself or just alert the driver to apply the brakes? In the first case that would be a limited application of self-driving, correct?)
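One generic way such systems are described is in terms of time-to-collision: alert the driver when it drops below one threshold, and apply the brakes automatically below a smaller one (systems of both kinds are on the market). The sketch below is only an illustration of that idea, not any particular manufacturer's logic, and the thresholds are made-up numbers.

[CODE]
# Generic time-to-collision (TTC) sketch of the escalation from "warn the
# driver" to "brake automatically".  The thresholds are made-up illustration
# values, not any manufacturer's actual calibration.
def ttc_seconds(gap_m, closing_speed_mps):
    """Time to collision; effectively infinite if the gap is not closing."""
    if closing_speed_mps <= 0:
        return float("inf")
    return gap_m / closing_speed_mps

def forward_collision_response(gap_m, closing_speed_mps,
                               warn_ttc=2.5, brake_ttc=1.2):
    ttc = ttc_seconds(gap_m, closing_speed_mps)
    if ttc < brake_ttc:
        return "apply brakes"      # automatic emergency braking
    if ttc < warn_ttc:
        return "alert driver"      # forward collision warning only
    return "no action"

# Closing on the car ahead at 15 m/s (roughly 34 mph faster than it)
for gap in (60.0, 30.0, 15.0):
    print(f"{gap:>5.1f} m gap -> {forward_collision_response(gap, 15.0)}")
[/CODE]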
 
bob p 1/23/2018 12:35 PM
Quote Originally Posted by Steve A. View Post
Yes, I consider YouTube videos published by an ultra-conservative organization like PragerU*** to be an excellent source of unbiased information... NOT!
I don't really care whether James Damore got his message out using PragerU or the Washington Post as a conduit. What matters to me about this is how the company treated the guy, who appears to have been genuinely concerned about making Google a better workplace environment for both himself and his co-workers, and got squashed in the process by a company that doesn't tolerate anything other than their sanctioned way of thinking. It's unfortunate that a company that treats its workers this way appears to be treating their customers (you and me) the same way as well.

It's also unfortunate that there are people who want to leftify and alt-rightify everything in the world, including this guy's predicament. I'm quite tired of hearing the ideological extreme views get pontificated over and over again, especially when they do nothing to get to the facts of the situation.

Rather than reading leftwing or rightwing interpretations of what the guy intended to do, why not listen to the guy himself, and get the direct scoop without having someone else filter the news for you? Here's an interview with James Damore where he speaks from the heart. Listen to what he has to say and decide for yourself, rather than just repeating what you read somewhere. It's a long video. You'll need 90 minutes to watch it. Chances are most people would rather read an opinionated 5 minute summary from their favorite politically biased news outlet.



Disclaimer: I have no idea who "The Rubin Report" is and I have no political affiliation, but I think the interviewer did a pretty good job of keeping his views out of the interview and just letting Damore speak his mind.
 
bob p 1/23/2018 12:45 PM
Quote Originally Posted by Enzo View Post
The car cannot be second guessing itself, it has the primary mission of providing safety to its own occupants.
I like the way you think. Cars should be designed that way. I'm all for taking out the young mother pushing the baby carriage to protect the car's occupant. For some reason, I don't think cars will end up being like that in the future. As the design of autonomous vehicles continues to evolve, their decision making will become more complex. Eventually the algorithms will involve more complicated decision making than they're capable of making now. Eventually cars will be able to communicate with one another, and cars will know if that school bus is full or empty. There's no reason to deny that that kind of information will be utilized in the future... unless we're only interested in having a short-term discussion.
 
eschertron 1/23/2018 5:38 PM
Now, Bob, I know you are exercising sarcasm here, even without the proper emoji. I thank my human brain for that.

Quote Originally Posted by bob p View Post
I like the way you think. Cars should be designed that way. I'm all for taking out the young mother pushing the baby carriage to protect the car's occupant.
However, "this", above, is exactly how cars are designed today. Safety features for the occupants, and none for pedestrians or other obstacles. I think we need a paradigm shift in how we view cars and other rapid transit vehicles. Why even allow cars anywhere near schoolkids or young mothers? Why have them at all except to take them to the cruise-in (parked) or to the speedway (closed course).

Let's find something safer, more energy-efficient, and less of a burden on the lowest-wage-earning among us. My son works at a restaurant on the other side of town (Toledo OH, a small city compared to many) and gas alone may run him $4 or $5 a day. Add that to insurance and upkeep, and the auto culture is more an albatross around his neck than a symbol of freedom. So I agree with you; I don't think self-driving cars are the solution. I don't think any kind of autonomous minimal-passenger vehicle is a good idea, unless it's pedal-powered.

[RANT] I lived for a few months in a village outside of Cologne, (formerly West) Germany. It was easy enough to walk to the bus station, take the bus to the mall, transfer to the train station in the basement of the shopping center, and get into the city that way. There was something appealing and creative about the mash-up of mall and train station. I imagine the overall capital expense was lower than building each separately, and the valuable farmland saved was a bonus. Contrast that with here in Ohio, where the rich soil is still being torn up and paved over to make more indoor soccer domes and shopping centers while the old abandoned retail areas sit decaying and unproductive. Why are we still so blind as to think that converting natural resources into landfill waste at ever-increasing rates is somehow equivalent to creating wealth? D@mn humans. Stupid creatures. [/RANT]
 
bob p 1/24/2018 8:51 AM
I agree that it doesn't make economic sense for low-wage people to have to spend a disproportionate amount of their income on transportation. In that context car ownership doesn't make sense, and ride sharing tends to make more sense, and public transportation makes even more sense -- but I think those things that make sense amount to rationalizing the best solution to a flawed premise -- that it makes sense for a population to commute in randomized directions.

Historically, that's not the way it worked in America. In the old days people lived in neighborhoods where they worked, and if they didn't, they didn't have to commute very far to get to their job. When people had to commute a long distance they typically did so in a synchronous way, like when people in suburbs would commute to cities by train.

Things are very different today. Individualized transportation only began to flourish in the post-WWII economic boom. As a result, today most people commute rather than living in the neighborhood of their workplace. Because all of this commuting is "disorganized" on a large scale, people commute in different directions. That asynchronous transportation requirement led to the state where everyone had to have their own transportation, and things eventually evolved into a 1:1 ratio of cars to people. The US was able to tolerate that paradigm because it was a wealthy nation. Now that the wealth is being extracted from America, the wages of the lower and middle class have had trouble keeping up with real inflation, and the commuting paradigm is becoming more difficult to support. Commuting has become prohibitively expensive for many, and people are now searching for more economical solutions to commuting; but that ignores the basic problem.

The fundamental problem is that commuting by any means is an expensive proposition. Transporting people is an energy intensive process, and there are cases where it makes economic sense and cases where it doesn't. To make commuting (and all of the costs associated with it) economically feasible requires a good paying job. Unfortunately, as the work environment leads to fewer high paying jobs and more jobs that pay less, the economic squeeze gets tighter and tighter. Eventually people are going to have to realize that commuting doesn't make sense for everyone, but that's a tough sell.

People do like to feel independent. In some respects we have to consider that having everyone commute (and own a car to do it) doesn't necessarily make sense. In the era of the Greatest Generation the type of commuting that we have today would have been unfathomable. Commuting became more common because our society had an excess supply of resources to support it. Now that economics are getting tighter and more people are beginning to become receptive to considering the environmental impact of their lifestyle, maybe we should reconsider whether having everyone commute is such a good idea. If we really want to help the environment, the answer isn't so much in changing the types of cars that we have, as much as it is alleviating peoples' needs to use them.

As an anecdotal example (Enzo might call this a red herring), I have a friend who lives 90 minutes away from his job and insists on living outside of the city in a remote bedroom community. He commutes every day, and brags about being environmentally conscious because he drives a subcompact car that has a low impact on the environment. One of his co-workers lives in the city and drives a pickup truck. While the guy who lives in the bedroom community likes to think that he's green because he drives a small car, his adverse impact on the environment is larger than the guy who owns the pickup, because he commutes every day while the other guy does not. Of course, he preaches holier-than-thou to the guy who owns the truck, and he's not receptive to hearing otherwise. He's done what he thinks is enough when it comes to being green, and he's reluctant to recognize that commuting is in and of itself an environmentally hostile act.
 
eschertron 1/24/2018 10:29 AM
I live close enough to my day job that I have thought about bicycling to work. Unfortunately, the suburban/rural mashup of roads between here and there combines high speed traffic and no sidewalks - and for the last 1/2 mi stretch no berm between the road and the drainage ditch! Biking would be suicide with all the crazies on the road, unless I can stay ahead of them at 50 MPH. I have had in the back of my head a design for an electric car (more like a horseless carriage!). One thing about here in NW Ohio, the flat flat flat land makes power requirements minimal for the kind of 'neighborhood electric vehicle' I have in mind. For travel on a flat road, the only 'real' obstacle to motion is wind resistance. I read somewhere that the power needed to overcome wind resistance is proportional to the cube of the speed. I have a spreadsheet somewhere where I figured out 'typical' HP/Wattage needs based on speed and cross-section. On the highway, cars are burning MOST of the fuel just for speed! On an incline things change dramatically. The potential energy needed to overcome an elevation difference can easily swamp any other effect (just ask anyone who's ever ridden a bike).
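Here's a minimal stand-in for that spreadsheet, using the standard aerodynamic drag relation (power = 1/2 * rho * Cd * A * v^3). The drag coefficient and frontal area are rough guesses for a small car, not measured values.

[CODE]
# Minimal version of the speed / cross-section "spreadsheet": power needed to
# overcome aerodynamic drag alone on flat ground, P = 0.5 * rho * Cd * A * v^3.
# Cd and frontal area are rough guesses for a small car, not measured values.
RHO = 1.225        # air density, kg/m^3
CD = 0.32          # drag coefficient (guess)
AREA = 2.0         # frontal cross-section, m^2 (guess)

def drag_power_watts(speed_mph):
    v = speed_mph * 0.44704            # mph -> m/s
    return 0.5 * RHO * CD * AREA * v ** 3

print(" mph   drag power")
for mph in (15, 25, 35, 50, 70):
    w = drag_power_watts(mph)
    print(f"{mph:>4}   {w:8.0f} W  ({w / 745.7:4.1f} hp)")
[/CODE]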

The community I travel through to get to work recently changed the residential street speed limit from 25MPH to 35MPH. It seems ludicrous to me, especially with I-75 literally in the backyards of these neighborhoods. Is this lust for speed - at all levels - happening all over? Or am I being over-sensitive?
 
bob p 1/24/2018 12:29 PM
I think the lust for speed is happening all over. In my neighborhood I see people speeding like they're responding to an emergency, even when they aren't going anywhere important. It's just a habit. People are accustomed to driving with a leadfoot, but at the same time they complain about what it costs to pay for gas. That's never made sense to me. When gas went up to $4/gallon people bitched and bitched and bitched ... but they still drove with a leadfoot.

I get the impression that fuel conservation isn't on anybody's mind. I can't tell you how many people get pissed off as they tailgate me, because I drive the speed limit and they want to go faster. When a stoplight turns green the guy behind me is honking by the time that I've depressed the clutch and started moving the shift lever. I often wonder why everyone seems to be in such a rush. It's not as if getting to your destination 2 minutes sooner is going to have any major impact on your life.

My hat is off to you if you've considered riding a bike. Or a motorbike.


Wind resistance also has a lot to do with the frontal cross-sectional area of the vehicle. Drag force grows linearly with frontal area and with the square of speed, so the power needed to overcome it grows with the cube of speed. Overall, the big problem (as you know) is drag. Then there are frictional losses, but those hardly compare to drag at highway speed. As you know, the major ways to reduce fuel consumption are to drive a sleeker vehicle and to slow down.

The last time that I drove through rural OH I was in Amish country. I saw horse-drawn carriages. It doesn't get much greener than that.
 
Enzo 1/24/2018 3:52 PM
Bicycles are real efficient, but here in Michigan they become a non-issue in all those months when the roads are snow and ice covered. So even if the guy saves the environment half a year, he STILL has to use powered vehicles of some sort the rest of the year. Unless he moves in and out of town with the seasons. And it is tough to transport a family on one. You can get a little bike trailer, but then you need to be even more vigilant threading the minefield of sharing the road with cars.

The big trend in the area these days is "mixed use". They tear down old buildings and put up mixed-use developments in their place. You know, three to six floors of residential above unrented retail space at ground level. AVAILABLE!!! (And of course right next to the street - let's make every street a cement canyon.) The townies claim it is real efficient, you can walk downstairs to work. Well, you could if all your jobs were in those little retail shops. It just isn't all as simple as that. The guy in the suburbs is not going to move downtown just so he can drive a gas waster. He is comparing his ecological footprint with his neighbor, not the guy downtown. Downtown guy may use less fuel, but he pays higher costs for parking his truck and for buying food and sundries at downtown markets. Higher insurance, and so on. And money is from income, and that is ultimately tied to energy use in some fashion.

Speed limits are an interesting thing. A few years back the state wanted to raise the speed limit on a local artery from 35 to 45. Locals howled. But studies showed that when the limit was 35, MANY cars flew by at 50. Now that it's 45, cars are actually averaging closer to 45. The studies also show that, in general, if you set speed limits at what people consider reasonable, people will drive them. Like who ever actually obeys those 8 MPH speed limit signs in apartment complexes?
 
bob p 2/24/2018 5:48 AM
I think those 8 mph signs in apartment complexes are more of a suggestion than a law. AFAIK the States don't get to regulate speeds on private property, only on public thoroughfares.

I recently talked to a friend of mine who works in one of the big publicly traded companies that is working on self-driving cars. His job isn't in self-driving cars (he's in a different part of the very big company) but he normally interacts with the autonomous car people, goes to their lectures, and stuff like that.

He described to me how the self-driving car logic is actually intended to work. The computer that's in charge of the car is actually several different computers working together. It's a sophisticated system designed to ISO standards, which require a third-party oversight computer that monitors the car's main computer for errors and interrupts the main CPU when it detects one.

To demonstrate the main CPU they have a human interface (a monitor) that displays what the computer is doing in terms people can understand. As the car drives down the road, the visual recognition logic assigns a "container" to every object that it identifies. If the car spots a garbage can on the side of the road, a balloon pops up that identifies it as a garbage can and places it into a container labeled "garbage can." Then, once the appropriate container has been assigned to the object, a set of container-based rules is applied against a very large rule matrix.
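To make the idea concrete, here's a toy Python sketch of that container-plus-rules scheme as I understood it; the labels, distances and rules below are invented for illustration and have nothing to do with the company's actual code:

[CODE]
# Toy illustration of the 'assign a container, then apply container-based rules' idea.
# Everything here (names, rules, thresholds) is invented for illustration.
from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str          # the "container" the vision system assigned
    distance_m: float   # estimated range to the object
    confidence: float   # classifier confidence, 0..1

# One rule per container type; a real system would have an enormous rule matrix.
RULES = {
    "garbage_can": lambda obj: "ignore" if obj.distance_m > 2 else "nudge_left",
    "pedestrian":  lambda obj: "brake" if obj.distance_m < 30 else "monitor",
    "vehicle":     lambda obj: "follow" if obj.distance_m > 10 else "brake",
}

def decide(obj: DetectedObject) -> str:
    """Look up the rule for the object's container and apply it."""
    rule = RULES.get(obj.label)
    if rule is None or obj.confidence < 0.5:
        return "hand_off_to_oversight"   # unknown or low-confidence object
    return rule(obj)

print(decide(DetectedObject("garbage_can", distance_m=1.5, confidence=0.9)))  # nudge_left
print(decide(DetectedObject("unicycle", distance_m=5.0, confidence=0.8)))     # hand_off_to_oversight
[/CODE]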

The process is very similar to how The Terminator in the Arnold Braunschweiger movies would identify a threat; this scene demonstrates the Terminator assigning "containers" to objects that it encounters:



What's interesting is that each container has logical rules associated with it, and the way machine learning works, the system analyzes millions of data points in the huge matrix and makes decisions based upon correlations with matrices that have been approved in the past. The process of sorting through those millions of data points is so complicated that no human can understand exactly how or why the computer makes the decisions that it does. With machine learning, we just have to learn to trust the machine to make the right decision. (!)

The car guys ran into a problem where the self-driving car would habitually steer to one side of the lane, preferentially hanging over the white line on the roadway. Machine learning working the way it does, there was no way to determine why this was happening. The computer had decided for some reason that its approval matrices made that default behavior desirable, and nobody could understand why. They looked into the problem for months and never found an answer... until somebody got a ride home from one of the guys who had been working on the project. That guy was one of the drivers who had "trained" the computer by having it mimic his driving style, and he liked to overhang the lane marker.

Garbage In, Garbage Out.

It's going to be a bright future.
 
Enzo 2/24/2018 6:31 AM
I just saw a Cadillac ad on TV last night, I forget what model. It was touting their new semi-autonomous driving. It showed a guy driving down the interstate at 70 with no hands on the steering. He was holding a soda in one hand and something else in the other. So they don't have door to door yet, but they are now selling a car that will drive itself down the freeway.
 
bob p 2/24/2018 7:01 AM
I've seen some youtube videos of people doing some incredibly stupid things in cars that have auto-pilot. Like reading a book while the car passes a semi on a snow-covered highway. Or a guy sitting in the passenger seat reading a book while the car drives down the road with nobody in the driver's seat to take over if the computer borks. The absolute worst case was a video where a guy and a gal were in the back seat while the car drove down the highway.

All of this reminds me of the story about the Navy SEAL whose Tesla drove into the side of a tractor trailer at full speed because it couldn't distinguish the white trailer from the bright sky.

They have systems that drive down the freeway just fine ... until they don't.

What people fail to realize is that there is an oversight computer that monitors the decisions being made by the main driving computer. When it determines that the main computer has made a mistake, it voids the main computer's control over the system, disconnects the autopilot, and hands control back to the driver immediately. You'd better be in the driver's seat, paying attention, when that happens.
 
Chuck H 2/24/2018 7:19 AM
How about an analogy? As a house painter I have to use ladders. I do have some degree of acrophobia. I have learned to control it, mostly. On ladders to 32' I'm alright and can stay cool and clear headed. Scaffolding is ok too. Things like swing staging or ladder jacks with planks are entirely out though because they're not "fixed" to anything. And neither are standard ladders, but "I" set the thing up and stabilize it and "I" control the balance point. I can't do swing or jacks even ten feet up without anxiety. I think this distinction is characteristic of many people and would account for why we don't have driverless cars yet. Not for me baby. No way. Not unless it's on a rail, in a slot or some other physical insurance against catastrophe. I never want to be out of control in an environment of potentially fatal circumstances.

EDIT: Forgot to mention boom and scissor lifts. Sort of ok with those. They can even be fun. But I'm much more likely to wuss out with the basket at far extension rolling over tilted ground. I've been chided for dropping the basket before moving over uneven ground. I mean, has anyone ever driven a boom at "this" extension over "this" uneven ground? There's the potential for a balance/ballast mistake that I don't want to participate in. Same with driverless cars.
 
bob p 2/24/2018 7:26 AM
I'm like you, Chuck. I want to be in control, and I'm not going to be willing to relinquish control to a machine. Especially given that the experts who design the machines can't even figure out how the machines are making the decisions that they are making. I just can't put faith into a system where nobody knows how it actually works. We just have to assume that the car is going to be safe in all situations because from what we've seen so far, it's behaved OK in the past. Sorry, but that isn't good enough for me.
 
nosaj 2/24/2018 7:35 AM
Quote Originally Posted by Chuck H View Post
How about an analogy? As a house painter I have to use ladders. I do have some degree of acrophobia. I have learned to control it, mostly. On ladders to 32' I'm alright and can stay cool and clear headed. Scaffolding is ok too. Things like swing staging or ladder jacks with planks are entirely out though because they're not "fixed" to anything. And neither are standard ladders, but "I" set the thing up and stabilize it and "I" control the balance point. I can't do swing or jacks even ten feet up without anxiety. I think this distinction is characteristic of many people and would account for why we don't have driverless cars yet. Not for me baby. No way. Not unless it's on a rail, in a slot or some other physical insurance against catastrophe. I never want to be out of control in an environment of potentially fatal circumstances.

EDIT: Forgot to mention boom and scissor lifts. Sort of ok with those. They can even be fun. But I'm much more likely to wuss out with the basket at far extension rolling over tilted ground. I've been chided for dropping the basket before moving over uneven ground. I mean, has anyone ever driven a boom at "this" extension over "this" uneven ground? There's the potential for a balance/ballast mistake that I don't want to participate in. Same with driverless cars.
I remember the first time I did ladderjacks and walkboards on 40ft ladders on Pensacola Beach hanging seamless gutters. I could not get used to the bounce and sway when you moved and the winds blowing.

nosaj
 
bob p 2/24/2018 8:04 AM
Quote Originally Posted by Enzo View Post
The only ethical mission a car design could reasonably be expected to have is to save its occupants. It would get impossibly complex expecting the car to be aware of all possible outcomes in every situation. If the choice is to ram a small car to the side to avoid ramming a school bus ahead, how does our car know if the school bus is full of kids or just a driver? Even if it could count the kids, would it be able to determine that hitting the big high-sitting school bus would likely be less threatening to those kids on board than ramming the small car would be to its occupant. I can see a pedestrian and make a judgement, but would the car be able to tell the difference between a young healthy college kid who might leap out of the way versus an old lame person? Can we reasonably expect the car to muse whether it prefers many injured school kids or one dead driver?

The car cannot be second guessing itself, it has the primary mission of providing safety to its own occupants.
I asked my friend at the big tech company about this. I mentioned the BBC's documentary and the "ethical dilemma" of having the car decide who to "victimize" in an unavoidable accident, and I mentioned your premise that the only ethical mission a car design could reasonably be expected to have was to save its occupants.

He laughed. Then he said, "The only ethical guideline that's built into self-driving cars is a mandate to prevent any design issues that could result in the car company getting sued."

I asked about who the car would choose to kill in one of those unavoidable accident scenarios. "The occupants," he replied.

No kidding.

You see, as SciFi fans we look forward to living in the kind of Utopian fantasy future we saw in Star Trek. We're idealists in that regard. We're the kind of people who like to think that in a perfect world, Asimov's Three Laws of Robotics would be programmed into any self-driving car:

Quote Originally Posted by Handbook of Robotics, 56th Edition, 2058 A.D.
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
But the reality is that because for-profit corporations are building robotic cars, they're more likely to be built according to Robocop's Prime Directives:

Quote Originally Posted by Robocop's Prime Directives
1. Serve the Public Trust
2. Protect the Innocent
3. Uphold the Law
4. Classified
That 4th item is the problem. Invariably, corporations build their products with the intent that those products won't harm their creator.



My friend told me that in the unavoidable accident scenario, the cars will never be allowed to make decisions about who to kill; they'll just try to stop the car, and if that doesn't work, the occupants get killed. Problem solved, reboot. The cars will never take evasive action to protect the occupants at the expense of an innocent third party, as that would only create liability for the car company. He further opined that if the car were to harm an innocent bystander, that person and/or their family would sue the car manufacturer. On the other hand, if the car kills its passengers, it effectively eliminates the people most likely to file a lawsuit against the company. I thought it was ironic that the people at the company who builds these things think that way. They definitely don't share the childlike SciFi fantasy of a Utopian future. They're more worried about CYA than anything Utopian.
 
Chuck H 2/24/2018 8:43 AM
Quote Originally Posted by nosaj View Post
I remember the first time I did ladderjacks and walkboards on 40ft ladders on Pensacola Beach hanging seamless gutters. I could not get used to the bounce and sway when you moved and the winds blowing.

nosaj
Where's the "don't like" tag? I'd have to keep spare underwear in the truck
 
Dave H 2/24/2018 10:37 AM
Quote Originally Posted by Chuck H View Post
Where's the "don't like" tag? I'd have to keep spare underwear in the truck
Why, what could possibly go wrong?

[IMG]https://riversonghousewright.files.wordpress.com/2012/02/img_8031.jpg[/IMG]
 
Gnobuddy 2/24/2018 9:47 PM
Quote Originally Posted by bob p View Post
...self-driving cars...sensors on those cars failing to accurately identify cars that are painted with black paint.
Apparently they don't see white too well either. Joshua Brown died because of a combination of his own stupidity and the failure of Tesla's "Autopilot" to see a white truck crossing the road in front of the car against a bright sky:

Tesla wrote in a blog post after the crash that the Autopilot system did not notice "the white side of the tractor trailer against a brightly lit sky, so the brake was not applied."
If you want to read the full context, that quote was taken from here: Tesla Autopilot played a role in fatal crash: NTSB - Business Insider

It's not just a matter of white paint, or black paint, or dirty cars. What happens when you drive through a mud puddle (or go through a carwash), and a couple of the sensors in your self-driving car become blinded by mud or confused by moisture? What happens if one of the cost-engineered sensors fails while the car is on autopilot? What happens if the car experiences a simple mechanical failure - say one of the balljoints that hold the front wheels onto the control arms fails - while on autopilot?

A reasonably competent human driver will find ways to minimize the extent of the damage in all of the above scenarios. I once had a front wheel fall off a car I was driving on the freeway at 65 mph; I brought it to a stable stop, driving on three tires and one brake rotor.

A self-driving car? It won't have a clue, and will probably kill the occupant.

As for ethical considerations made by self-driving cars, come on, people. You are vastly overestimating the capabilities of AI software. There won't be any ethical considerations. Self-driving cars are literally dumber than cockroaches (think about that for a moment.) Cockroaches don't make ethical decisions or have any self awareness. Neither do self-driving cars.

-Gnobuddy
 
bob p 2/25/2018 1:56 AM
I'm amazed that the NHTSA gave Tesla a pass on the Brown catastrophe, citing that Tessie's failure to apply the brake was not a safety issue because the Tesla was not designed to be "cross-traffic-aware." I imagine that people are going to run into the same problems with the new Cadillacs that Enzo was talking about.

I think it's obvious that visual recognition is a bad paradigm for a guidance system. The military prefers radar, GPS and targeted LASER.

With visual recognition, bit depth is part of the problem, especially when you're relying upon bit depth in a single channel (like black and white). When it comes to recognizing colors, there are multiple channels (red, green and blue) contributing data to the image instead of one. If you're trying to recognize an object that doesn't have an intense midband contribution in the RGB channels, then you're left with fewer bits of data to interpret. The result is that white trucks and white sky look the same because there isn't much contrast.

When I watch digital cable TV I see color banding because the cable company down-samples the number of color bits in the signal to maximize their bandwidth. The result is that different shades of white (or different shades of black) are poorly represented on-screen and typically display as large, blotchy color bands. If you've ever seen this then you know what I'm talking about. It's particularly bad in scenes that involve a dark background with a flashlight. Lack of color depth is precisely why a Tesla couldn't discriminate a white semi trailer from the bright sky it was silhouetted against.
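As a crude illustration of the bit-depth point (the luminance values below are invented, not taken from any real sensor), here's what happens when two nearly equal bright levels get quantized down to a few bits:

[CODE]
# Crude illustration of how coarse quantization erases contrast.
# The luminance values are invented for illustration.
def quantize(value, bits):
    """Quantize a 0.0-1.0 luminance value to the given number of bits."""
    levels = (1 << bits) - 1
    return round(value * levels)

bright_sky = 0.96      # hypothetical normalized luminance
white_trailer = 0.93   # hypothetical normalized luminance

for bits in (4, 6, 8, 12):
    sky, trailer = quantize(bright_sky, bits), quantize(white_trailer, bits)
    verdict = "indistinguishable" if sky == trailer else f"differ by {abs(sky - trailer)} level(s)"
    print(f"{bits:2d}-bit: sky={sky:5d}, trailer={trailer:5d} -> {verdict}")
[/CODE]

At 4 bits the two land on the same code value; add bits (or a second sensing modality like radar) and they separate.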

The problem with color camera recognition is that it's an inherently inferior technology being deployed because it's the cheapest solution to the problem. Everyone knows that radar would be better, but car manufacturers aren't interested: the camera-based recognition package is a cheap chip sold by nVidia, while radar systems are expensive.

The military had the exact same problems with visual recognition systems when they were initially developed for tanks. They found that visual recognition worked just fine on bright sunny days (the conditions under which the system was developed and tested) but failed completely in adverse weather, because the system just couldn't resolve the landscape under conditions it had never been trained to recognize.

No autopilot for me. The real problem for me is going to be the numskulls on the road that embrace the technology and place me at risk for an accident.
 
bob p 2/25/2018 2:05 AM
Quote Originally Posted by Gnobuddy View Post
What happens if one of the cost-engineered sensors fail while the car is on autopilot? What happens if the car experiences a simple mechanical failure - say one of the balljoints that hold the front wheels onto the control arms fails - while on autopilot?
Some sensors spew data out in real time onto the CAN bus. The oversight computer does have the ability to recognize when CAN sensors go offline and trigger an error signal. The problem is that a balljoint doesn't have a direct connection to the CAN bus, so it's unlikely to be recognized directly when it "disappears."
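Detecting a silent sensor is the easy part; a minimal sketch of that kind of watchdog might look like this (the sensor names, 200 ms timeout and simulated traffic are assumptions for illustration, not any real CAN stack):

[CODE]
# Minimal sketch of a watchdog that flags CAN sensors that stop reporting.
# Sensor names, timeout and the simulated traffic are invented for illustration.
import time

TIMEOUT_S = 0.2   # assume each sensor should report at least every 200 ms

class SensorWatchdog:
    def __init__(self, sensor_ids):
        now = time.monotonic()
        self.last_seen = {sid: now for sid in sensor_ids}

    def on_message(self, sensor_id):
        """Call this whenever a frame from the sensor arrives on the bus."""
        self.last_seen[sensor_id] = time.monotonic()

    def stale_sensors(self):
        """Return the sensors that have gone quiet longer than the timeout."""
        now = time.monotonic()
        return [sid for sid, t in self.last_seen.items() if now - t > TIMEOUT_S]

wd = SensorWatchdog(["wheel_speed_fl", "wheel_speed_fr", "ride_height_fl"])
time.sleep(0.25)                 # simulate two sensors going silent
wd.on_message("wheel_speed_fl")  # only this one keeps reporting
print(wd.stale_sensors())        # -> ['wheel_speed_fr', 'ride_height_fl']
[/CODE]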

If the car does have an active suspension, then the suspension should recognize an abrupt change at that corner. Whether or not the autopilot recognizes the problem may be entirely dependent upon whether that problem has been anticipated, and an appropriate response designed into the decision matrix.

Chances are that's not going to happen, because a lot of what's being done with cars is centered around machine learning, and machine learning requires previous exposure to a situation in order to be trained how to handle it. Engineers won't be designing new cars with contingency plans for wheels falling off. They want you to buy another new car before that happens, and most wheels only fall off on cars that are out of warranty. It's not their problem.

The Brown fatality shows us that there's going to be a level of immunity provided to the car manufacturers when cars are "misused" on autopilot. The Tesla autopilot was never intended to be used the way that Brown chose to use it. Even the government's best finger-pointing agency, the NHTSA, determined that the Tesla crash was not Tesla's problem because it did not present a safety issue. Why? Because the Tesla was never designed to be cross-traffic-aware; the result is that Brown's death was determined to be Brown's fault. As a result it seems doubtful that his surviving parents would have any chance of recovering a penny from Tesla.
 
Chuck H 2/25/2018 6:43 AM
Quote Originally Posted by bob p View Post
The Brown fatality shows us that there's going to be a level of immunity provided to the car manufacturers when cars are "misused" on autopilot. The Tesla autopilot was never intended to be used the way that Brown chose to use it. Even the government's best finger-pointing agency, the NHTSA, determined that the Tesla crash was not Tesla's problem because it did not present a safety issue. Why? Because the Tesla was never designed to be cross-traffic-aware; the result is that Brown's death was determined to be Brown's fault. As a result it seems doubtful that his surviving parents would have any chance of recovering a penny from Tesla.
I think it has to depend on how clearly the autopilot's limitations are presented. Of course it may not. I hate to think this accident was the result of failing to "read the fine print" and that this qualified Tesla for immunity. That the autopilot won't recognize cross traffic and avoid it is a pretty big deal, I think!?! It should be in blinking, illuminated script on the dash whenever autopilot is engaged! But...

If it IS clear that Teslas autopilot has this cross traffic limitation I'd consider it on the same level as, say, MacDonald's (with another caveat coming up ). That is, of course a diet of Big Mac's and french fries is going to kill you. It's up to YOU to be aware of the requisite use and implementation of the product. In the case of fast food that would simply mean moderation. BUT...

I see the Tesla case as different for the very good reason that a guy who kills himself eating junk food isn't specifically placing others in danger. Meaning that an idiot who can't manage the proper care and use of a Big Mac (or Big Macs) is not a threat to anyone but himself. But put that same idiot behind the wheel of a Tesla with autopilot and EVERYONE is in danger. For that I think Tesla should be held accountable. This moral position of being responsible for the safe distribution of your product when it can endanger others in the hands of the buyer is pretty standard. It's the reason they don't sell guns to children. So, on this premise I can't believe Tesla got away with it. Good lawyers.
 
Enzo 2/25/2018 9:35 AM
Up front I establish I am not a fan of self driving cars.

But one thing about them is the technology is bounding ahead at great speed. By that I mean that it is maturing fast and learning a ton with each passing day. I do not mean it is getting ahead of itself. Some systems may push it too far, but a problem from a year ago is old news to the engineers that are working on them.

True or not, we have heard the stories of a guy driving an RV who put it on cruise control then stepped into the back to grab a soda. If people make unrealistic assumptions, systems will fail to live up to expectations. Re the Tesla event, even the driver said he admits to fucking up.

Here is one of the ads for the Cadillac. I recognize the road he is on, I-96 near Howell.

 
g1 2/25/2018 7:10 PM
Quote Originally Posted by Chuck H View Post
put that same idiot behind the wheel of a Tesla with autopilot and EVERYONE is in danger.
Maybe. Or maybe they're in less danger than if he didn't have the Tesla driving aids. He was watching a DVD. Maybe he would have anyway, maybe it would have been a school bus full of kids instead of a truck.

All day long we drive under the mindless assumption that the other drivers will be doing a good job and obeying the rules. Many times they do not. The casualty rate is quite bad. The road is a likely place to die.
The idea that self-driving cars need to be perfect is a straw-man argument. The status quo is far from perfect.
They do need to be significantly safer than human driven. That doesn't seem so unreachable.
We all think of ourselves as good drivers. What we forget to think about is the guy coming the other way trying to text while driving into the sun.
Weather permitting I'm on a motorcycle. It gives a bit different perspective on the caliber of the average driver when you're that vulnerable. The 'average' driver is not a good driver. They're excellent consumers of infotainment systems, though.
Bring on the machines, let the people do what they want (it's not driving). Those of us that do enjoy driving are the exceptions, not the rule.
 
Chuck H 2/25/2018 8:34 PM
Quote Originally Posted by g1 View Post
The idea that self-driving cars need to be perfect is a straw-man argument. The status quo is far from perfect.
They do need to be significantly safer than human driven. That doesn't seem so unreachable.
That's a very strong argument and it makes perfect sense from a statistical perspective, BUT...

If WE (the collective) are out there on the roads killing ourselves, we can accept that. Not exactly like that, but there is usually someone to pin it on, or a cause we can understand (if not exactly accept). As humans we accept the risk personally and under our own control. Even if driverless cars cut the current rate of road fatalities in half, I would still have no desire to put myself into a potentially fatal activity where no one personally invested in my survival is in control. That seems like a perfectly sane and reasonable survival instinct even if the statistics don't agree. Because "I am not a number! I am a free man!" (I know you'll get that one)

Also consider how it goes when there IS a fatality. If a drunk driver runs over a cyclist, people think it's a shame and that driver is removed from the system (ideally, I know). But if a driverless car malfunctions and runs over a cyclist, that's a whole different thing because of perception. Now we have an evil corporate machine that wasn't made safe enough because "the company" is trying to maximize profit. And "OH HOLY HELL!!!" They actually have statistical reports that discuss an acceptable number of fatalities!?! As in "Our driverless cars are so safe that they only kill X number of people per year." Or maybe "OUR cars kill fewer people than THEIR cars."??? It seems absurd to me. Even the consideration. But only because it requires such a leap outside of human instinct to absorb and accept being a statistic for the greater good rather than an individual taking control of their own destiny. And then there's this!...

I don't know if it was earlier in this thread or another one where I mentioned that with the advent of driverless cars will inevitably come "Advanced Safety Options" (ASO), like paying extra to be preferred in the event of a potentially dangerous circumstance. You can't even pretend that wouldn't happen. Even if the competition had to learn how to bust the transmit code on competitors' products, they would. If there was any risk to the customer, the ASO would interfere with objectivity and try to make surrounding autonomous vehicles defer to its client. Maybe I'm cynical, but I have a hunch I won't be able to afford the ASO. So now how are my statistics?
 
bob p 2/25/2018 9:50 PM
Quote Originally Posted by Enzo View Post
Re the Tesla event, even the driver said he admits to fucking up.
Joshua Brown was killed in the accident. I wonder how he made his admission from the grave.
 
bob p 2/25/2018 9:53 PM
Quote Originally Posted by Enzo View Post
I noticed that in part of the commercial, the car was in the left hand lane and going so slow that cars were passing in the two lanes to the right. That's a pretty bad way to operate a self-driven car. Slower traffic belongs on the right.
 
bob p 2/25/2018 10:08 PM
Quote Originally Posted by Chuck H View Post
I would still have no desire to put myself into a potentially fatal activity where no one personally invested in my survival is in control. That seems like a perfectly sane and reasonable survival instinct even if the statistics don't agree. Because "I am not a number! I am a free man!"
My sentiments exactly.

And I'm old enough to have caught the Prisoner reference.



Also consider how it goes when there IS a fatality. If a drunk driver runs over a cyclist, people think it's a shame and that driver is removed from the system (ideally, I know).
The problem is that the first driver to be removed from the system will be the cyclist; unfortunately the drunk driver may or may not be removed. Chances are they will survive to do it all over again.

I have a problem with the excuses that cage drivers get away with when they run over motorcyclists. "I didn't see them." Irrespective of whether they saw them or not, it was their responsibility to see them and their responsibility shouldn't go away with a lame-ass excuse like that, but all too often it does.

As a result, people who ride cycles have to be extra vigilant and assume that every cage out there is on the road with the specific intent to kill them. Keeping yourself safe requires intentionally keeping as far away from other vehicles as possible. I routinely break traffic laws to put a safe distance between me and other drivers. It's suicide not to.

I wonder how long we'll have to wait to read an NHTSA report in which a self-driving car is exonerated for killing a motorcyclist because the self-driving car was never intended to be motorcycle-aware.


I don't know if it was earlier in this thread or another one where I mentioned that with the advent of driverless cars will inevitably come "Advanced Safety Options" (ASO), like paying extra to be preferred in the event of a potentially dangerous circumstance. You can't even pretend that wouldn't happen. Even if the competition had to learn how to bust the transmit code on competitors' products, they would. If there was any risk to the customer, the ASO would interfere with objectivity and try to make surrounding autonomous vehicles defer to its client. Maybe I'm cynical, but I have a hunch I won't be able to afford the ASO. So now how are my statistics?
I could see privateers and system hackers selling that service. I'd buy it.
 
Enzo 2/25/2018 10:18 PM
I wonder how he made his admission from the grave.
I recalled such an admission. I may be incorrect then.

If he was watching a DVD as reported above, then perhaps it was someone else who claimed he was not using the system correctly, rather than he himself.
 
bob p 2/25/2018 10:54 PM
I think you're right -- somebody else figured out that he was watching a DVD and was not paying attention to the drive.

The problem in that accident (as I see it) is that the Tesla kept sounding the alarm to get driver interaction, it didn't get it, and it kept on going. I'm thinking that if a car wants the driver to take the wheel and that doesn't happen immediately, then the car needs to stop. It should definitely not keep on going in an alarm situation.

Then you've got the problem of determining how a car that's driving in the left lane, going full speed, can stop itself without causing an accident by making a sudden stop in traffic.
 
tedmich 2/25/2018 11:03 PM
Quote Originally Posted by Chuck H View Post
MacDonald's (with another caveat coming up ). That is, of course a diet of Big Mac's and french fries is going to kill you. It's up to YOU to be aware of the requisite use and implementation of the product. In the case of fast food that would simply mean moderation. BUT...
That ubiquitous Scottish restaurant maintains officially that their food is intended to be a "rare treat" and thus they bear no responsibility for the hideous health decline that would accompany its REGULAR consumption*

*something their yearly $963 million ad budget tries to cause continuously

As to autonomous vehicles, no one loses their job with self driving personal cars so these will come slowly... look for fleets of robot semis and cabs to appear much sooner, as they are being tested nationwide.
 
bob p 2/25/2018 11:06 PM
I want to know why I can't get Scotch Eggs at Mickey D's.
 
Leo_Gnardo 2/25/2018 11:18 PM
Quote Originally Posted by bob p View Post
I want to know why I can't get Scotch Eggs at Mickey D's.
You can, sort of, with a little ingenuity. Stop in for breakfast, order a couple sausage egg McGuffins. Take the sausage patties, put an egg in between them, that's about as close as you're gonna get. Cheese optional. Wash down with a slug of Laphroaig for an authentic touch. Hoot mon, what a way to start the day! You'll be ready to toss the caber.

tedmich, "ubiquitous Scottish restaurant", geez that's funny! My first laugh for Monday, and we're only an hour in.
 
Enzo 2/26/2018 12:23 AM
McHaggis!!!
 
Chuck H 2/26/2018 12:41 AM
Two all beef patties, special sauce, lettuce, cheese, pickles and onions in a sesame seed sheeps belly
 
Enzo 2/26/2018 12:56 AM
It takes guts to make that stuff.
 
Dave H 2/26/2018 3:52 AM
Quote Originally Posted by bob p View Post
And I'm old enough to have caught the Prisoner reference.
I've been to "The Village". It's not far from here, it's Portmeirion in North Wales. No 6's house is tiny, much smaller than it looks on film. The boat in the harbour is made of concrete. There was no sign of Rover on the beach (but I wasn't trying to make a run for it).

[ATTACH=CONFIG]47309[/ATTACH]
 
bob p 2/26/2018 10:21 AM
Quote Originally Posted by Enzo View Post
It takes guts to make that stuff.
It takes more guts to EAT it.
 
Enzo 2/26/2018 2:21 PM
Apology in advance for the tangent...

There is a show on Food Network called Chopped. The contestants are given a basket with four mystery ingredients they have to include in an entree they create. Usually one of the ingredients is not really well suited, thus the challenge. For example: squab, shallots, green beans, and peppermint candy.

One episode included prunes, and canned haggis. (really, they can the stuff too?) A panel of judges then critique the various dishes created. As I walked by the TV, I heard one judge say "The haggis to prune ratio is way off here..." Not a phrase I would ever expect to hear anywhere.
 
Leo_Gnardo 2/26/2018 3:03 PM
Quote Originally Posted by Enzo View Post
One episode included prunes, and canned haggis. (really, they can the stuff too?) A panel of judges then critique the various dishes created. As I walked by the TV, I heard one judge say "The haggis to prune ratio is way off here..." Not a phrase I would ever expect to hear anywhere.
I can see where the prunes come in handy. Haggis, you might want to send that awful offal through ya fast as possible...

But on a slightly more serious note, all haggis amounts to is sausage meat or scrapple made out of sheep instead of swine. Everybody I know that's tried it has survived just fine, some even said "I don't see what the fuss is about, it was actually pretty good." All depends how you spice it up I guess. As a final line of defense, keep a bottle of Frank's hot sauce ready to hand - hey I put that stuff on everything! As the offspring of generations of Scots, I have to credit them for carrying on long enough to produce me - likely most of them had to consume some haggis to survive.
 
Enzo 2/26/2018 3:41 PM
Everyone has heard of haggis, even if they haven't seen it. Scrapple is usually only known to people from the middle of the eastern seaboard. I love scrapple. But haggis is more fun to make fun of.
 
eschertron 2/26/2018 5:40 PM
Since this thread's taken us deep in the weeds off the side of the road
I sometimes stop by the freezer section of the local supermarket just to spy the tubs of chitterlings. Nobody's ever heard of chitterlings, but many folks would recognize 'chitlins' by name, if not by sight.
 
bob p 2/26/2018 6:05 PM
Quote Originally Posted by eschertron View Post
Since this thread's taken us deep in the weeds off the side of the road
Entirely befitting, as self-driving cars aren't all they're cracked up to be.

I sometimes stop by the freezer section of the local supermarket just to spy the tubs of chitterlings. Nobody's ever heard of chitterlings, but many folks would recognize 'chitlins' by name, if not by sight.
Buy 'em in bulk!

[IMG]http://music-electronics-forum.com/attachment.php?attachmentid=47313&amp;d=1519694285[/IMG]


So were you just cruising by the freezer section to look at the tubs of chitlins or were you actually there to buy one?

I have to admit, I haven't ever made them, though I have eaten them at friends' houses more times than I could count.

And it's a coincidence that I'm listening to this right now ... the same record was spinning when I was eating chitlins in the late 70s.

 
Gnobuddy 2/26/2018 7:17 PM
Quote Originally Posted by g1 View Post
They do need to be significantly safer than human driven. That doesn't seem so unreachable.
I know we hear from Google and other rich corporations that this has already been achieved.

Let's think about this for a second. Let's start with a rough ballpark calculation of the computational ability of a human brain.

The human brain is estimated to have a hundred billion neurons (that's 10^11). Each neuron can fire up to about ten times per second (that's 10^1). Each neuron can also wire itself up to as many as a thousand other neurons (10^3). Multiply those together, and you get 10^15: our rough estimate of the number of operations per second that the human brain can process.

I have seen other estimates of up to 10^18 operations per second, which is a thousand times faster than 10^15. I don't know if anyone knows for sure which of these numbers is more accurate; but, to be conservative, let's go with the lower estimate, only 10^15 operations per second.

Now let's look at a typical powerful computer CPU of today. You typically have four cores each operating at around 3 GHz (that's 3 x 10^9 times per second). Some operations take several clock cycles, but let's be generous, and overestimate the CPU's capability; let's pretend it can do one operation per core per clock cycle.

So now we multiply those two numbers, and we get 1.2 x 10^10 operations per second.

Here is the point to note: 10^10 is a hundred thousand times smaller than 10^15. The fast modern computer is a hundred thousand times less capable than the human brain.
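If you want to redo the arithmetic yourself (the inputs are the same ballpark guesses as above):

[CODE]
# Redoing the back-of-the-envelope comparison above; all inputs are ballpark guesses.
neurons = 1e11             # ~10^11 neurons
firings_per_second = 10    # ~10 Hz per neuron
connections = 1e3          # ~10^3 connections per neuron
brain_ops = neurons * firings_per_second * connections

cores, clock_hz = 4, 3e9   # one generous op per core per clock cycle
cpu_ops = cores * clock_hz

print(f"brain: ~{brain_ops:.1e} ops/s")               # ~1.0e+15
print(f"CPU:   ~{cpu_ops:.1e} ops/s")                 # ~1.2e+10
print(f"ratio: ~{brain_ops / cpu_ops:,.0f} to one")   # ~83,000, i.e. roughly 10^5
[/CODE]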

We may not know exactly what intelligence is, but it is definitely connected to the brain's computational ability.

Clearly, we can't expect a modern PC CPU to have human intelligence, or anything near it.

So what animal(s) on earth are a hundred thousand times stupider than a human? Insects, apparently. Maybe worms.

So our expensive desktop CPU has - maybe - the smarts of an earthworm. Maybe, just maybe, the smarts of a cockroach. And the CPUs in self-driving cars are probably less capable than your powerhouse $400 Intel gamer CPU - economics will see to that. So your self-driving car is probably stupider than a cockroach, as I've said on this forum a few times. This is not an exaggeration, this is a plausible estimate of the truth.

Those of us who've driven for many years know that, a lot of the time, driving may not actually require a lot more intelligence than a cockroach has. Keep in the center of the lane, keep a safe distance behind the vehicle in front. A cockroach brain could probably handle that - which is why Google, Tesla, et cetera have successfully demonstrated autonomous cars.

But those of us who've driven for many years also know that every so often, a situation comes along that requires human-level smarts. Like the time in the early 1990's when the little girl on her new bicycle pedalled down her driveway, all the way into the roadway, and straight into the path of my car (she came from the right side of the road.)

There was no time to stop, the little girl obviously couldn't control her bike so my horn was useless, there was no oncoming traffic, so I made an emergency double lane change over to the *wrong* side of the road. The extra distance gave me time to stop completely - with my left-side wheels almost touching the left curb of the street. A fraction of a second after I had fully stopped the car, the little girl bicycled into the side of my car, still out of control, still at full speed.

She fell over and started to cry - but because of the actions I'd taken, she only made contact with a completely stationary object (my car), instead of being mowed over by a tonne and a half of metal moving at 35 mph. She had a few little scratches on her hands and legs, but that was all.

Her parents didn't even notice what was happening until their child was already in the street. Then they rushed to her, picked her up, glared at me blackly for a few minutes, and left without so much as a "Thank you!"

Part of what made it possible for me to avoid running over that little girl, was the fact that as I rounded the corner onto that residential street, I'd spotted the little girl standing by her bicycle near the end of the driveway. Her parents were nearby, talking to each other, distracted, not noticing. Since I'm a human being, I know kids are unpredictable, and make mistakes. So I was instantly on the alert, before she started rolling down the driveway, headed for the street and my car.

Would a cockroach have known any of this? Of course not. Would one of Google's cars? Of course not. Would a self-driving car know that this was a situation when driving over to the wrong side of the road was the right thing to do? Of course not.

No problem, Google has billions of dollars in cash and rooms full of lawyers, so the courts would give them a pass. It was the little girl's fault, or maybe her parents'. Too bad she was killed by the self-driving car, but of course, it wasn't the car's fault. No siree.

I've been driving for quite a few years, and I have other stories like this, of accidents I narrowly averted by thinking my way out of a tough situation, or anticipating an incipient human failure; I'm sure you, and probably everyone who's driven for a decade or two, has similar stories.

So no, I do NOT believe that self-driving cars will be safer than human drivers. There is a crap-load of propaganda that says so, but it is all coming from the people who want to sell us self-driving cars, not from independent studies. I think cockroach-stupid cars will, at best, cope with cockroach-stupid driving situations, but that is all; when the emergency situation requiring actual intelligence comes along, they will fail utterly. And the courts will give the manufacturer a pass. (Ask the ghost of Joshua Brown, RIP.)

Don't drink the Kool-aid until we have at least a decade or two of real-world statistics to draw from, and it stops being Kool-aid and becomes actual data. Until then, be very, very, very sceptical. (Ever seen *any* computer software that doesn't fail spectacularly from time to time? Me neither.)

-Gnobuddy
 
Chuck H 2/26/2018 8:20 PM
Your perspective seems a little extreme, as certainly cockroaches know how not to run into each other and make safe forward progress. g1's point is that if ALL vehicles were driverless then EVERY CAR is following the same rules. And THAT'S how you make it safer than human drivers. It could absolutely work, no doubt in my mind, BUT!!!... It's your argument about autonomy that supports my position. I would honestly rather be accountable for my own risk than EVER subject myself to becoming an autonomous death statistic. And I'm pretty sure I'm not alone. Still, there's no doubting that g1 is right about the likelihood of traffic fatalities decreasing with complete automation. People are VERY inconsistent and will never manage the road as well as computers could IF all vehicles on the road were working with the same format. No doubt at all.

So let's walk another path for a second. How about natural selection? Since traffic fatalities account for a large percentage of human culling, and we are currently responsible for handling our own problems on the road - be it our own level of prowess or attentiveness, or our awareness and avoidance of dangerous drivers and circumstances - this still amounts to natural selection. So what happens to our capabilities as a species if we allow ourselves to be utterly autonomous in this!?! I don't think that's a strictly contrary position to take. If there are indeed fewer traffic fatalities AND, due to autonomy, a higher ratio of those who wouldn't (I might argue shouldn't) survive are spared the potentially fatal responsibility of their own capabilities, well, I think you see where I'm going with this. Even more people, and a higher ratio of those less capable of surviving on their own accord. Can that end well? I know this is the modern age where every kid gets a ribbon. I think that direction is going to end our species.
 
Enzo 2/26/2018 9:46 PM
Actually, smoking kills ten times as many people each year as traffic accidents do.


Gno, I think I agree with a lot of what you said, but the human brain is busy pumping the heart and breathing the lungs, thinking about its day at work, reliving an argument from the night before, and so on. It is maintaining a memory of all its past life and accumulated knowledge. When I drive, I have not forgotten all I know about amplifiers, or how to make meatballs, or Woodstock. My brain is remembering 70 years of stuff; the auto-car only needs to remember the program in its ROM. I don't think comparing the car to the brain is a valid comparison. After all, your cell phone has way more computing power than the computer that flew crews to the moon. Yet that little computer got us a quarter million miles away and back multiple times.


Auto-cars scare me, but I have to admit, they will likely wind up safer. Like a lot of people are afraid of flying, but are not afraid of driving down the highway to get to the airport. And that drive is far more likely to kill you.
 
bob p 2/26/2018 10:00 PM
Quote Originally Posted by Chuck H View Post
... the likelihood of traffic fatalities decreasing with complete automation. People are VERY inconsistent and will never manage the road as well as computers could IF all vehicles on the road were working with the same format. No doubt at all.
The person who imagines that self-driving cars are going to provide us with a new era of safety is putting a lot of faith in a population of vehicles that are acting independently, asynchronously, in a chaotic fashion, without any common goal and without any common means of oversight. If driverless cars are going to be safe, they're going to have to be synchronously controlled by a master computer that coordinates the activity of each vehicle so that they work together to optimize the flow of traffic and communicate with one another to avoid accidents -- there will need to be oversight and control of the vehicles by a master application that functions along the lines of an Air Traffic Controller.

Unfortunately none of the cars on the road today are even close to that sort of implementation. Right now the car manufacturers are boasting about their ability to steer a car while keeping it between two white lines. They seriously oversell the car's ability in the minds of the car-buying public by misusing terms to inspire a false sense of confidence in the buyer. An example of this is equating basic lane-keeping with an autopilot, when the reality is that the two aren't even close to one another.

The need for synchronized behavior is obvious -- when a wheel falls off a car as it's driving down the highway, that car needs to stop. The car needs to recognize the condition and bring itself to a stop. But the other traffic needs to slow down and stop as well so the crippled car doesn't get rear-ended. Having another car that's independently guiding itself down the roadway with a camera isn't good enough. When one car suffers a critical failure, it needs to signal that condition to the other cars so that all of the cars on the roadway act with a common goal: to slow down and stop to avoid a multiple-car pileup. The current paradigm of relying upon optical sensors and sounding an alarm inside the cabin isn't going to cut it -- handing control over to a driver who hasn't been paying attention just doesn't provide a fast enough response.
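To sketch what that kind of cooperation could look like in the simplest possible terms (the message fields, 500 m range and slow-down behavior are invented for illustration; real vehicle-to-vehicle protocols are far more involved):

[CODE]
# Toy sketch of 'one car broadcasts a critical failure, nearby cars react'.
# Message fields, ranges and behavior are invented for illustration only.
from dataclasses import dataclass

@dataclass
class HazardMessage:
    sender_id: str
    kind: str            # e.g. "mechanical_failure"
    position_m: float    # distance along the road, simplified to one dimension

class Car:
    def __init__(self, car_id, position_m, speed_mps):
        self.car_id, self.position_m, self.speed_mps = car_id, position_m, speed_mps

    def broadcast_failure(self, fleet):
        """Tell every other car on this stretch of road about the failure."""
        msg = HazardMessage(self.car_id, "mechanical_failure", self.position_m)
        for car in fleet:
            if car is not self:
                car.on_hazard(msg)

    def on_hazard(self, msg):
        """Slow down if the hazard is ahead of us and within 500 m."""
        gap = msg.position_m - self.position_m
        if 0 < gap < 500:
            self.speed_mps = min(self.speed_mps, 10.0)
            print(f"{self.car_id}: hazard {gap:.0f} m ahead, slowing to {self.speed_mps} m/s")

fleet = [Car("A", 1000.0, 30.0), Car("B", 800.0, 30.0), Car("C", 1200.0, 30.0)]
fleet[0].broadcast_failure(fleet)   # car A loses a wheel and tells everyone behind it
[/CODE]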

Right now we have a situation where independently driven cars are being driven with an asynchronous plan. At most, they are synchronized in that they are travelling in the same direction on the same road, and that's about it. The drivers typically aren't working together. The current driverless paradigm replaces a population of humans driving without a coordinated plan with computers driving without a coordinated plan. Where's the safety in that? There's just no reason to assume that driverless cars will be any safer than independently driven cars until the driverless cars communicate with each other to make their actions on the road safer.



How about natural selection. Since traffic fatalities account for a large percentage of human culling, and we are currently responsible for handling our own problems on the road, be it our own level of prowess or attentiveness or awareness and avoidance of dangerous drivers and circumstances, this still amounts to natural selection. So what happens to our capabilities as a species if we allow ourselves to be utterly autonomous in this!?! I
Natural selection begins the moment you board the Johnny Cab.
 
Enzo 2/26/2018 10:15 PM
Why do they need to be controlled globally from a central facility? The driven cars of today certainly function and have no such control. If I am driving to the mall, and you are driving home from work, we have different goals and intentions, yet a set of rules allows us all to function together.

One thing I see every day is the guy turning left from the second lane (or right). I see the guy racing up on my left to cut across two lanes in front of me to get to the exit ramp 5 seconds faster than getting behind me. I see guys decide to make it through the yellow light even when it has turned red. I see cars wandering in and out of their lane while the operator is gawking at his smart phone. Or even a book. I see cars going around flashing lowered gates at railroad crossings. On a country road yesterday, some nitwit screamed past me crossing the double yellow line on a blind hill, then passed the next car also on a hill that curved, again crossing the dual yellow line. The list goes on. I suspect very few auto-cars will do those things. The particular technology is advancing very fast.
 
Gnobuddy 2/27/2018 12:48 PM
Quote Originally Posted by Chuck H View Post
Your perspective seems a little extreme.
Maybe, time will tell. Certainly we need a counterbalancing perspective other than the idiotic rah-rah cheering we're hearing from manufacturers, legislative authorities, advertisers, and talking heads on TV.

Quote Originally Posted by Chuck H View Post
As certainly cockroaches know how to not run into each other and make safe, forward progress.
They do run into each other, also over each other - I have had the disgusting experience of witnessing a roach infestation in a neighbour's barn once.

Have you ever seen a cockroach or beetle that was accidentally flipped over onto its back? Beetles die that way, of exhaustion and starvation, because they are too stupid to do anything more than wave their legs helplessly in the air for hours on end. With no comprehension of the world around them, they can't adapt to an unexpected situation. Why would we expect a cockroach-stupid self driving car to do any better?

Quote Originally Posted by Chuck H View Post
g1's point is that if ALL vehicles were driverless then EVERY CAR is following the same rules.
That's not how AI systems work. Researchers tried to create rule-based AI systems in the fifties, sixties, and maybe seventies. They quickly found that this approach only worked for extremely simple problems - beyond that, the number of required rules multiplied so fast that you could never have enough rules.

I'm by no means an expert in AI, but from what I understand, today's AI systems are "taught" with reward and punishment. As an analogy, you take your cockroach, and hook it up to a driving simulator, so that every twitch of the cockroach's legs or antenna causes the car to turn left, right, brake, or accelerate.

Now, you let the cockroach move. If the car does something it shouldn't (runs into a tree, say), you drip dilute acid on the poor cockroach (Bad cockroach! Bad, bad cockroach!) If the car stays in the lane and moves forwards, you give the cockroach a grain of sugar (Good boy! Good, good cockroach!).

Now you repeat this a few hundred thousand times, and then proudly call all the TV reporters you can, and announce that you have just created a fantastic self-driving artificial intelligence system and put it in the greatest car ever made.

Notice that the humans torturing the cockroach actually have no idea what, if any, "rules" the cockroach is following. The cockroach has no idea what it's doing, either - it's just trying to avoid being burned with acid.
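For what it's worth, the reward-and-punishment scheme I'm describing is loosely what the textbooks call reinforcement learning. Here's a toy Python sketch of it for a one-dimensional "stay in the middle of the lane" task; the lane model, rewards and constants are invented for illustration and bear no resemblance to real car software:

[CODE]
# Toy Q-learning sketch of the reward/punishment training described above.
# The lane model, rewards and parameters are all invented for illustration.
import random

POSITIONS = range(5)     # 0..4 across the lane; 2 is dead center
ACTIONS = (-1, 0, +1)    # steer left, hold, steer right
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1

q = {(s, a): 0.0 for s in POSITIONS for a in ACTIONS}

def step(pos, action):
    """Move across the lane; sugar for centering, acid for hugging the edges."""
    new_pos = min(max(pos + action, 0), 4)
    reward = 1.0 if new_pos == 2 else (-1.0 if new_pos in (0, 4) else 0.0)
    return new_pos, reward

pos = random.choice(list(POSITIONS))
for _ in range(5000):
    if random.random() < EPSILON:
        action = random.choice(ACTIONS)                    # explore
    else:
        action = max(ACTIONS, key=lambda a: q[(pos, a)])   # exploit the best known move
    new_pos, reward = step(pos, action)
    best_next = max(q[(new_pos, a)] for a in ACTIONS)
    q[(pos, action)] += ALPHA * (reward + GAMMA * best_next - q[(pos, action)])
    pos = new_pos

# After training, the greedy policy steers toward lane center from any position,
# but nothing in the Q table "explains" why in terms a human would recognize.
print({s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in POSITIONS})
[/CODE]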

In case you think I'm exaggerating - consider that in 2004, a petri dish containing about 25,000 neurons from a rat brain was taught to fly a (simulated) aircraft using exactly this reward / punishment approach (except they used an applied voltage rather than acid and sugar):

1) https://www.newscientist.com/article...fighter-plane/

2) Extracts - "Brain" In A Dish Acts As Autopilot Living Computer

3) https://news.nationalgeographic.com/...etri_dish.html


The first of those links contains the sentence "The cells could one day become a more sophisticated replacement for the computers that control uncrewed aerial vehicles", exactly the kind of stupid response I've come to expect from half-educated reporters without a clue. Seriously, what moron would trust a drop of gloppy rat brain cells with the intelligence of a flatworm to fly a lethal amount of weight around humans or human property?

Quote Originally Posted by Chuck H View Post
Still, there's no doubting that g1 is right about the likelihood of traffic fatalities decreasing with complete automation.
This comes down to asking "Can a cockroach do a better job than the worst human drivers?" And will the lives saved by doing better than the worst human drivers compensate for all the people who die at the hands of self-driving cockroach-cars, because the good human drivers are no longer in control of their cars?

I honestly don't know the answer to that one.

But why, exactly, are we so thrilled about the possibility that our vehicles might soon all be controlled by a cockroach-level intelligence that is just barely more functional than a drop-dead drunk human being?

By the way, I think your comments about natural selection are dead-on. We humans have been collectively subjected to less and less danger for some centuries now, and as a result, have had fewer and fewer unavoidable reasons to sharpen our wits. The village idiot no longer gets eaten by the lion he tried to pet, removing his genes from the pool.

In recent decades, I think something else is going on: large-scale arrested development, a psychological condition in which human bodies mature to adulthood but the brain remains child-like, arrested at an earlier, pre-adult stage of development. Look around and you'll see symptoms of it everywhere. (One of my early clues, some twenty-plus years ago, was seeing adult women paying for and proudly wearing teddy-bear-shaped backpacks, something that would have been entirely normal for a seven-year-old child fifty years ago, but which is quite disturbing in a 25-year-old adult.)

-Gnobuddy
 
Enzo 2/27/2018 2:33 PM
If you flip a beetle on its back, it kicks its legs in the air until it dies. Not because it is stupid, even though it is, but because it lacks the body parts to right itself. Now flip a driverless car on its back - it will lie there spinning its wheels until it runs out of fuel (or some safety inversion detector shuts it off). But wait, put a human-driven car on its roof, and it too is unable to right itself. What have we learned? That the example is not instructive.
 
Gnobuddy 2/27/2018 2:54 PM
Quote Originally Posted by Enzo View Post
My brain is remembering 70 years of stuff, the auto-car only needs to remember the program in its ROM.
Remember, your brain is an estimated hundred thousand times smarter than the self-driving car's computer(s). If you were somehow using 90% of your brain on tasks other than driving (which would make you the world's most unfocussed driver!), the remaining 10% of your intelligence is still ten thousand times smarter than the car.

I agree with your point that the car software is tuned for driving, and only driving. But since it starts out a hundred thousand times stupider than you, that tuning alone won't make the car anywhere near as capable as you are if the situation gets complex and requires actual intelligence.

Quote Originally Posted by Enzo View Post
I don't think comparing the car to the brain is a valid comparison. After all your cell phone has way more computing power than the computer that flew crews to the moon.
This is a classic case of how human abilities blind us to reality. Which is harder, computing a path to the moon, or being able to tell a mail-box from a trash can?

To a human brain, there's no contest. It takes years of study to get to the point where you understand the second-order differential equation that describes motion through space, and even then it takes a smart person to do it. But any dummy can tell a mailbox from a trash can!

The thing is, computationally, the difficulty of these two problems is exactly the opposite. A second-order differential equation is actually computationally easy - you can give any reasonably smart person a copy of Numerical Recipes ( https://en.wikipedia.org/wiki/Numerical_Recipes ), an introductory book on computer programming, and Newton's laws in differential equation form, and she will, within a week or two, be able to write a program to predict the path of a lunar launch vehicle between earth and moon.
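To show what I mean by "computationally easy", here's a deliberately crude sketch - Earth's gravity only, simple one-second Euler steps, none of the corrections a real mission needs:

Code:
       import math

       G, M_EARTH = 6.674e-11, 5.972e24   # gravitational constant, Earth's mass (SI units)
       x, y = 7.0e6, 0.0                  # start roughly 600 km above the surface
       vx, vy = 0.0, 9000.0               # m/s, a bit below escape velocity
       dt = 1.0                           # one-second time steps

       for step in range(100_000):
           r = math.hypot(x, y)
           pull = -G * M_EARTH / r**3     # acceleration per unit position, toward Earth
           vx += pull * x * dt
           vy += pull * y * dt
           x += vx * dt
           y += vy * dt

       r = math.hypot(x, y)
       print(f"after {step + 1} s, distance from Earth's centre: {r / 1000:.0f} km")
That's essentially all the arithmetic there is; the hard part for the Apollo engineers was doing it reliably on 1960s hardware, not the math itself.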

But writing a program to tell the difference between a mailbox and a trash can? That's computationally impossibly difficult. Google's super-smart AI systems, running on tens of thousands of PCs clustered together into a supercomputer, still can't do it with any reliability.

This is why a very stupid computer (programmed by very smart human beings) got us to the moon and back. But that doesn't mean that same computer can tell a road from a tree-trunk, or a mailbox from a pedestrian.

There are aspects of driving that require very little intelligence. Like cruise control - we've had mechanical cruise-control for decades. It follows a super-simple algorithm:

Code:
       # simplest possible cruise control: match wheel speed to the set point
       if wheel_rpm < desired_rpm:
           increase_throttle()
       else:
           decrease_throttle()
With a microprocessor, it is pretty straightforward to extend that crude cruise control algorithm to maintain a safe distance from the car in front:

Code:
       # same idea, but also require a minimum following distance before speeding up
       if wheel_rpm < desired_rpm and following_distance > min_safe_distance:
           increase_throttle()
       else:
           decrease_throttle()
And you could again easily extend that to apply brakes if the closing velocity exceeds some safe threshold, for example.
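A rough sketch of that extension, with the names and thresholds invented purely for illustration:

Code:
       # extend the cruise logic above with a crude emergency-brake rule
       def cruise_step(wheel_rpm, desired_rpm, following_distance,
                       min_safe_distance, closing_speed):
           if closing_speed > 5.0:    # m/s - closing on the car ahead too quickly
               return "apply brakes"
           if wheel_rpm < desired_rpm and following_distance > min_safe_distance:
               return "increase throttle"
           return "decrease throttle"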

All this is pretty simple (even when you include some Newtonian mechanics so the cruise control has a better understanding of how hard to apply the gas or not, etc).

But telling a roadway from a plowed field? Telling a parked car from a moving one? Detecting one car among the dozens in the field of view? Recognizing the presence of a pedestrian? All these are almost impossibly hard computational problems. A cockroach probably couldn't successfully do any of them. A targeted AI program, as we have seen, can do a little better than a cockroach. But only a little: a bright sky is (was?) no different than a white truck to Tesla's oh-so-smart AI software...

Cockroaches don't get drunk, and neither do self-driving cars. Cockroaches probably don't get enraged, and neither do self-driving cars. Yes, they do have those things in common! But is that adequate qualification for putting the cockroach in charge of your life - and the lives of those around you?

Computer code is fundamentally stupid. That's not about to change.

And for those who have already forgotten: bad code in Toyota's car computers has already cost many lives:
1) https://www.cbsnews.com/news/toyota-...has-killed-89/
2) Toyota recall: Last words of father before he and his family died in Lexus crash | Daily Mail Online
3) Toyota Recall 'My Car Just Wouldn't Stop' | PEOPLE.com
4) Toyota Settles Over Death of Family in High-Speed Crash - The New York Times

-Gnobuddy
 
Chuck H 2/27/2018 3:19 PM
FWIW I never said that cockroaches don't run into each other (or crawl on top of each other). I said they know how not to. I'm sure if bumping into or crawling on top of one another were a problem for cockroaches they wouldn't do it!?! Besides... Who said we're going to program these cars to behave like cockroaches? I would think that a sedan-sized cockroach would be bad for any driver (and I'm not considering traffic safety )
 
Enzo 2/27/2018 5:22 PM
I didn't intend to bog down on details, I just find the cockroach argument to be simplistic - it wants to put all its begs in one ask-it.

As a human I may have more brain power, but I also have to use some of it to FOCUS on my driving. The car has no choice but to be totally focused at all times. We can always find examples of multimillion-line code having some loophole or glitch that was unforeseen, some combination of circumstances that becomes lethal. Just as a wet highway that is safe to drive on can drop half a degree in one area and freeze, leaving the unaware driver spinning on ice he never suspected would be there. That is why they put those "bridge freezes before roadway" or similar signs up. How many of us actually slow down whenever we see those signs? Focus.

Wanna bet that a Tesla faced with the same white truck today would have a different response? That event was a long time ago in the world of AI development. Widen the field of view a little and the outline of the truck resolves.

Realistic complaints from me? I see cross traffic, and I always worry whether he is going to run the red light. If he is 10 yards from the corner, it's pretty easy to tell. But I drive country roads. The guy is 100 or 200 yards from the stop sign - does he look like he intends to stop? Does an auto-car even look that far away? A "parked car" on the side: I see the driver looking back over his shoulder and the brake lights go off. I immediately think he might pull into traffic, and I doubt the car can anticipate that without some motion. I move over preemptively.

"90% of my brain" lumps the brain into one big smart pool, but the brain isn't organized that simply. The car never worries that I will get there before Aunt Ethel has to leave for the airport. The car is not arguing with its wife over how I took too long in the bathroom. The car is not still pissed off about the scolding the boss gave it during the day.

All I am saying is that the numbers do not tell the story. IS there need for a ton more development? Of course. To me saying the car is dumber than a cockroach is like saying a Fender is louder than a sandwich.
 
Chuck H 2/27/2018 9:14 PM
Ok... This is really good. Enzo's post made me think of this (probably obvious, but I'm about to say it out loud ). Cars as driven by people, and the roadways and traffic rules we use to navigate them, have been products of evolution toward PEOPLE THAT DRIVE CARS! In this light I'm not sure it makes sense to try and design any automated system to do it in any similar way. Unfortunately the existing system is, to some degree, setting the stage for the working format. In other words, rather than trying to design self-driving cars to do what people do, we might have better success redesigning the transportation method to accommodate what computers do.
 
bob p 2/28/2018 12:42 AM
Quote Originally Posted by Gnobuddy View Post
Have you ever seen a cockroach or beetle that was accidentally flipped over onto its back? Beetles die that way, of exhaustion and starvation, because they are too stupid to do anything more than wave their legs helplessly in the air for hours on end. With no comprehension of the world around them, they can't adapt to an unexpected situation.
I'm not sure that the beetles' problem is that they're too stupid to flip over; I think the problem is that they're just not designed to be capable of flipping themselves over. There are species of snails with the same problem -- they're not able to flip themselves over, no matter how hard they try. OTOH, there are many species of snails that are able to flip themselves over. I think it's more related to the evolutionary design that goes along with the niche each species ended up filling in the environment. Realistically speaking, you can't really compare the intellect of one invertebrate to another... when they're both attempting to self-right, it doesn't matter that one succeeds where the other does not. It's not as if one is smarter than the other -- they're still both invertebrates.

Quote Originally Posted by Gnobuddy View Post
Why would we expect a cockroach-stupid self driving car to do any better?
I'm missing out on why we'd even expect a self-driving car to try to right itself if it ends up on its back. If it ends up on its back, then it has failed at what it was doing, and I'd prefer that it not try to do anything else.
 
bob p 2/28/2018 1:32 AM
.
.
I'm tired of typing self-driving car and autonomous vehicle. I'm going to use SDC from now on.
.
.

Quote Originally Posted by Enzo View Post
Why do they need to be controlled globally from a central facility? The driven cars of today certainly function and have no such control. If I am driving to the mall, and you are driving home from work, we have different goals and intentions, yet a set of rules allows us all to function together.
There doesn't need to be a centralized facility; there just needs to be centralized control. There are several reasons that coordinated synchronous control will always outperform autonomous asynchronous control. I'm sure you can think of a few if you think hard enough. The most basic example of coordinated synchronous control involves training drivers to stay on the right. That's what allows you to drive to the mall without hitting me while I drive to work. Now extrapolate from that.

I don't think that two SDC headed in opposite directions have as much need to communicate (at least not until one realizes that it has left its lane; then it had better broadcast an alert signal to the other SDC on the road). But the world is a more complex place than the scenario of you driving to the mall while I'm driving home from work.

Part of the problem is that the people who are designing SDC are operating from a really stupid design perspective. They imagine that an SDC will become a viable mode of transportation with a computer that's designed to be accident-avoidant as it drives around like a half-blind grandmother. The problem is that when grandmothers hit the road, they don't drive like the Little Old Lady from Pasadena; they tend to act like a rolling chicane. That's what you'll get when you drop SDC that are designed to operate at a safe distance from one another into a hostile operating environment. The harsh reality is that the SDC will only have the luxury of driving like a grandma in rural locations. You simply can't drive that way in a big city. If you get on the Dan Ryan Expressway in Chicago and you drive like that, chances are that you'll get several bullets in your head because you've pissed off everyone around you.

Imagine an SDC getting on the Dan Ryan Expressway in Chicago during rush hour, where people drive 90 miles an hour while hanging onto each other's bumpers. People around here drive their cars like they're jet fighter pilots flying in formation. It's a competitive environment. Intensely competitive. SDC aren't designed to be competitive. They're designed to be safe. When an SDC encounters a "crazy" competitor, it's going to slow down to avoid an accident. Imagine that happening in an environment where everyone is intent on getting somewhere fast. An SDC that refuses to keep the pace will become an obstacle, and an SDC that uses autonomous control to try to keep the pace won't be as safe as one that communicates with the other cars on the network and synchronizes its behavior with that of its neighbors. By definition, SDC are going to have to be aware of other cars on the road and work together, otherwise the paradigm just won't work.

When people drive along at 90 bumper to bumper, they're not looking at the bumper of the other car to gauge distance, as an SDC does. They're looking at the driver in front of them, not his car, to determine when a braking event or a lane change might happen. (People don't use turn signals here, so you have to watch the driver to know if he's going to change lanes. An SDC looking for a blinking light just isn't going to fare well.) And Chicago drivers aren't just looking at the driver in the car ahead of them -- they're looking through that car, at the driver two cars up, to determine when they may have to brake. Because reaction time matters.

In the real world you can't rely on the visual interpreter that's connected to a camera that's covered in road dirt. In the real world you have to account for latency in braking, and latency is exactly what you'll get when you wait for an SDC to figure out that the car in front of it has already started slowing down. To avoid accidents you'll need to eliminate braking latency, and that means one car will have to broadcast its intent-to-brake information to the cars around it as it engages the brakes. Then the cars behind it will have to listen for the signal and act accordingly.
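Conceptually, something like this -- and this is purely my own sketch of the idea, not what any real vehicle-to-vehicle protocol (DSRC, C-V2X or whatever they settle on) actually looks like:

Code:
       import json, socket, time

       def broadcast_braking(car_id, decel_mps2, port=50000):
           # illustrative "intent to brake" message blasted onto a local network
           msg = json.dumps({"car": car_id, "event": "braking",
                             "decel": decel_mps2, "time": time.time()}).encode()
           sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
           sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
           sock.sendto(msg, ("255.255.255.255", port))
           sock.close()
The cars behind would listen on the same port and start braking the instant the message arrives, instead of waiting for a camera to notice brake lights.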

Do SDC have to be coordinated by a centralized computer somewhere else? No. But do they need to work together in a dense traffic situation? Absolutely.

Imagine what would happen if an SDC that was trained to drive across rural Canada on Highway 1 were dropped into Chicago traffic. Not liking the traffic density, it slows down to try to open a safe distance between itself and the car in front of it. That's going to create a traffic jam, not to mention annoying the hell out of every driver behind it. People in the lanes to its sides will immediately recognize that a space in the roadway has opened up, and they will guttersnipe into the space it has just created, because guttersniping gets you ahead of where you were before. So now the SDC sees another obstacle that's too close and further reduces its speed. Then another driver guttersnipes into the space and causes the SDC to repeat its behavior. The SDC ends up becoming a menace to navigation as it brakes again and again, continuously reducing its speed until it eventually stops in the middle of the Expressway and creates a traffic jam. In Chicago, chances are that someone will set it on fire.

In those situations where optical sensors are dirty, something better is required. In those cases where traffic is dense, something better is required. Granted, today's SDC aren't even designed to handle the Dan Ryan commute; there's just no way they could succeed. They'd continually be trying to operate safely in an intensely competitive environment that they weren't designed for. It's obvious to me that car-to-car communication is going to be helpful, and synchronized control is going to be even more helpful. Driverless systems that work in dense urban environments are going to need better guidance systems than cars that mosey along Route 66 in the middle of Arizona.


Numerical Recipes used to be a free download. Sadly, it looks like they now charge for the downloadable code libraries. Glad I've got mine.
 
bob p 2/28/2018 1:42 AM
IMO this SDC business has nothing to do with the intellect of an SDC. It has more to do with the competitive and predatory behavior of human drivers.

Quote Originally Posted by Chuck H View Post
Ok... This is really good. Enzo's post made me think of this (probably obvious, but I'm about to say it out loud ). Cars as driven by people, and the roadways and traffic rules we use to navigate them, have been products of evolution toward PEOPLE THAT DRIVE CARS! In this light I'm not sure it makes sense to try and design any automated system to do it in any similar way. Unfortunately the existing system is, to some degree, setting the stage for the working format. In other words, rather than trying to design self-driving cars to do what people do, we might have better success redesigning the transportation method to accommodate what computers do.
There's no doubt that redesigning the transportation system will have to happen. The problem is that at one extreme you have the current situation, where people drive cars and SDC aren't all that good, and at the other extreme you have the alternate paradigm, where the roadways are all designed for SDC operation and SDC will do quite well.

The problem is the in-between situation, where human drivers whose behavior is unpredictable and can change at will are mixed with SDC whose behavior is predictable and unchangeable. Proponents of the SDC are naive if they don't believe that human drivers will learn how SDC behave and then use that knowledge against SDC obstacles on the roadway. They will use that knowledge to one-up the SDC in traffic. I know I will. I know people in Chicago will do it too. The end result is going to be human drivers marginalizing SDC onto the side of the roadway like a pack of wolves hunting down a sheep. That won't stop until people are no longer behind the wheel.
 
Gnobuddy 2/28/2018 9:16 AM
Quote Originally Posted by Enzo View Post
The car has no choice but to be totally focused at all times.
I think you have to have awareness to be focused. Without awareness, self-driving cars can have no focus.

There can be no distractions, either - that also requires focus.

But we are still operating on the assumption that a stupid thing with absolutely no understanding of the world is going to be a better driver than a human being - does that not seem more than a little odd? Would we give the same credibility to someone who proposed teaching grasshoppers to drive our cars? Or field-mice, which are thousands of times smarter than our best computer CPUs?

To me the big "ask" is being asked to believe that an insect-level intelligence can successfully take on the complex ethical and social issues that surround driving. Life and death, pain and suffering, financial loss, our legal system - these are weighty things that all human drivers are expected to consider when they drive (which is why most of us don't just ram the jerk who's cut you off for the fifth time). A dumb AI system trained by repetition has no understanding of any of these concepts; killing a human is no different than running into a lamp-post, and the vehicle understands neither "human" nor "lamp-post" nor "kill" nor "running into".

I think we have perhaps collectively grown up watching too many melodramatic science fiction TV shows, in which the robots are not only exceedingly capable, but usually more capable than the hapless humans around them. As a result, we are perhaps unable to see how incredibly limited today's crop of robots is (even though they are vast improvements over the even less capable machines of yesteryear).

At any rate, I think this thread has reached the point of diminishing returns. Nobody has changed their previous opinions, and nobody seems likely to, so we're all now just re-iterating our thoughts and beliefs on the subject. Humanity does spend a lot of time doing that, but it's not particularly constructive time, so I think I'll go do something else.

-Gnobuddy
 
bob p 2/28/2018 11:04 AM
People who want to indulge in sci-fi fantasies are going to indulge in sci-fi fantasies. It's what they do.
 
Chuck H 2/28/2018 11:13 AM
Quote Originally Posted by Gnobuddy View Post
I think you have to have awareness to be focused. Without awareness, self-driving cars can have no focus.

There can be no distractions, either - that also requires focus.
Semantics. This statement is designed specifically to be contrary without qualifying itself. Humans don't follow programs when they drive, and computers don't have to focus when they drive. So, when discussing an analogy between the two, the words "focus" and "program" are usefully close in meaning as they apply to humans and computers respectively. Please don't confound debates or premise your position with trite nitpicking like this. It dilutes the quality of your arguments.
 
Gnobuddy 2/28/2018 1:15 PM
Quote Originally Posted by Chuck H View Post
Semantics. This statement is designed specifically to be contrary without qualifying itself.
Absolutely not. Enzo used the word focus, I pointed out that you cannot have focus without awareness. What's the beef with that?

Quote Originally Posted by Chuck H View Post
Humans don't follow programs when they drive
We don't follow line-by-line instructions like a computer, but we do have patterns of neural connections that we created as we were taught to drive, and strengthened as we continued to drive. Our brains mostly follow those pre-wired "programs", which is why an experienced driver can drive almost without mental effort - until some sort of unexpected situation comes up. We may be chatting with our passengers as we drive, but when we see the orange cones and the waving warning flag up ahead, or a string of red tail-lights, or a tractor-trailer big rig wiggling from side to side on the brink of jack-knifing, we stop talking and put more thought into the driving process.

Quote Originally Posted by Chuck H View Post
Please don't confound debates or premise your position with trite nitpicking like this. It dilutes the quality of your arguments.
I'm sorry you see it that way; I don't. My point all along has been that self-driving cars are too STUPID for any human analogy to apply to them. That includes the concept of "focus". And that's what I said in my post.

-Gnobuddy
 
Chuck H 2/28/2018 2:30 PM
Quote Originally Posted by Gnobuddy View Post
Absolutely not. Enzo used the word focus, I pointed out that you cannot have focus without awareness. What's the beef with that?
My beef is that Enzo used it (appropriately) as an analogy to indicate that computers are not distracted. Calling him out on the specifics of the word borders on being a grammar douche. But to play it your way, computers don't focus because they aren't capable of distraction. In fact your outline of the word clearly illustrates that humans focus because they must in order to achieve, awareness being the only reason for the NEED to focus. Ergo, no awareness, no focus. And you based a long post on this to point out that computers/cockroaches are dumb. Even your comparison between computers and cockroaches is obtuse to the issue because you can't program a cockroach. Suggesting that computers can't drive because they're not as smart as people is also obtuse. Your arguments of stupidity, cockroaches and the superior human intellect would then imply that humans must be better at everything than computers and that computers aren't better at anything than cockroaches. I think you see where this is going. Then again, maybe not, since you were unable to process Enzo's appropriate analogy using the word "focus" to indicate a computer's inability to be distracted when the previous sentence clearly prefaced that the word was an analogy.

The roads and traffic laws are designed around humans driving cars. There's a lot to account for there. I don't know if driverless cars are possible. But if not, it won't be because cockroaches can't drive. It's not even because computers aren't aware or lack processor power. It'll be because an appropriate program can't be created by people trying to hammer a square peg into a round hole.
 
bob p 2/28/2018 4:42 PM
You guys are starting to sound like me and Enzo, lol. For once, I can be glad that it's someone else.

FWIW, I agree with both of you. Semantics and analogies tend to be a necessary component of a discussion like this when you attempt to use personification to attribute decisive qualities to non-people. Suffice it to say that I don't think either one of you is wrong.

FWIW, I also think that it is possible for computers to lose focus and become distracted. In bit nerd parlance it's called a wait-state. In the worst case scenario, you get a lockup. Hopefully your SDC will never display the Blue Screen of Death immediately before it drives you into oncoming traffic because it's waiting for an image update to come from the mud-covered optical sensor...

[IMG]http://music-electronics-forum.com/attachment.php?attachmentid=47355&d=1519861422[/IMG]

On the subject of not being able to program a cockroach, what many people refer to as "programming" does occur in animals. In animals "programming" is achieved through a combination of classical conditioning and operant conditioning. Pavlov showed us a long time ago that you can program a dog to salivate by ringing a bell (classical), just like you can program a person to step on the brake when they see a brake light ahead of them (operant), typically acting first and doing the conscious decision making after the initial reflex takes place.

What's odd about this is that neither one of you like the idea of SDC. You have the same outlook, but you're disputing one another anyway.
 
Gnobuddy 2/28/2018 4:49 PM
Quote Originally Posted by Chuck H View Post
Your arguments ... then imply that humans must be better at everything than computers
Computers can add, subtract, and store numbers a lot faster than people can. That's the one thing they do better than people, and billions of times faster.

So, given appropriate programming, computers can be better than humans at a task that requires only addition, subtraction, and storage of numbers.

People have managed to program computers to add, subtract, and store numbers in ways that are useful to us - to represent letters on a screen, to calculate our pay check, to simulate the atmospheric conditions over North America on a given day, to calculate the airflow around a rocket in flight. The computer adds, subtracts, and stores numbers; the intelligence always came from humans, never from the stupid computers.

One of the still unsolved problems in AI is recognizing, for example, one bolt in a tray full of identical bolts. A three year old human child can be taught to do this, and so can a chimpanzee, or a dog. But computers, crunching (literally) billions of numbers per second, are absolutely terrible at this task.

The child, dog, or chimp has the ability to understand that one bolt is one "thing", and a pile of them is a collection of many similar "things". Cockroach-stupid computers, so far, are incapable of doing this. So are real cockroaches. The task requires a level of intelligence beyond what a cockroach - or a computer - has.

A computer doesn't know what a bolt is. It doesn't know what a road is. It doesn't know what a car is. It doesn't know what life is. It doesn't know what death is. It has no clue what pain means, or ethics, or responsibility. Why are we even considering putting such a thing in charge of a deadly weapon (a moving vehicle with enough mass and velocity to kill)?

Semi-autonomous cars will surely happen. Cruise control? It's been here for decades. Traction control? Most of today's supercars would be undriveable without it. Cruise control with safe-distance following? It'll probably work much of the time, but will probably fail now and then (like Tesla, the white truck, and the bright sky). Cruise control with lane-following? Ditto; there are times when human intelligence can barely tell where the lane edges are (after a snowfall, for example), and the cockroach-car certainly won't do any better.

But the supposed fully autonomous car that will drive you safely through urban streets and residential neighbourhoods while you play "flappy bird" on your shiny overpriced fondle-slab-phone-thing? I think that's a ridiculous, unrealistic dream that won't be coming true any time in the near future.

Quote Originally Posted by Chuck H View Post
...and computers aren't better at anything than cockroaches.
I never said "anything"; computers certainly add and subtract numbers faster than cockroaches, though they have no more awareness of what they're doing than the cockroach does.

Let's look at this self-driving-car thing one more time, from a slightly different perspective. Of all the routine human activities that large numbers of people participate in daily, driving is the one that carries the most risk to other people, and other people's property. As far as society at large is concerned, driving is probably the single largest ethical and legal responsibility that most people carry. A car is a lethal weapon, and if you do not operate it according to society's rules, you can be charged with use of a deadly weapon.

This is why we don't let children drive (and they are a hundred thousand times smarter than a self-driving car). We don't let people with lowered mental acuity drive. We don't let people drive if their ability to function is impaired due to alcohol, or various diseases, or various medical drugs. We don't let trained chimpanzees drive (and studies have suggested they are better drivers than humans, with faster reflexes and better eyesight.)

So why are we even considering handing over such a loaded and complex responsibility to something with the intelligence of a cockroach? On the face of it, it makes no sense at all.

As to the supposed intelligence of computers (and with no intention of malice towards you at all), I've gotta ask: I wonder if you've ever written a computer program, Chuck? If not, you may not be aware just how incredibly stupid computers actually are. The art of computer programming mostly consists of learning how to make a very fast - but incredibly stupid - tool actually accomplish anything useful.

Anyhew - as I said before, I think the goodness has gone out of this thread, and I'm certainly not here to stir up anger or annoyance, so I'd rather divert my energy to other, more positive, threads.

Have a nice day, everyone (and you particularly, Chuck!) And I mean that quite sincerely!

-Gnobuddy
 
Bloomfield 2/28/2018 5:09 PM
I've been following this thread for a while, but being painfully slow at typing I haven't weighed in. I think the important question, outside of whether the technology is capable, is who asked for self-driving cars in the first place? It seems to have been forced on us by the 'tech' companies while our governments have stood aside and allowed this to move closer and closer to becoming a certainty. Contrary to what some might have us believe, it IS the role of governments to regulate industry and to keep citizens safe. Governments have been shown to be far too late to respond to technological change in vehicle safety. As an example, cell phone use while driving was allowed to become an ingrained habit long before laws prohibiting it were passed, and still there is so little attention paid to distracted driving of this sort, which is in reality a far bigger problem than impaired driving. Why are you allowed to put a GPS display in the middle of your windshield? And of course all the new cars have giant touch screens in the middle of the dashboard, so that instead of reaching over to turn down the heat, you now have to take your eyes off the road and focus on a screen. This could have been stopped before it started, but for the manufacturers it is cheaper to produce, and it provides another flashy geegaw to boost sales.

Self-driving cars are being pushed on us under the guise of being safer. Whether or not they are is yet to be determined, but there are any number of other ways that we could make roads safer without committing ourselves to this route. I have definitely noticed that driving standards have declined steadily in recent years. In addition to problems like texting while driving, new cars (and pickups especially) have been getting bigger and bigger and more powerful, while simultaneously having worse and worse outward visibility. Maybe the key to safer roads lies not in self-driving cars, but in better cars, better driver training and testing, and most of all fewer cars. Instead of investing all this money in self-driving car technology, we could be improving public transport and creating incentives to put freight transport back on the railways where it belongs.

I'm curious to see where all this leads, but I suspect we will have very little say in the matter. I have no doubt that self-driving cars can work, but like many of you I am not convinced that they can work in concert with human-driven vehicles, and it seems to be heading in the direction of humans eventually being pushed out. As for myself, I like driving and consider myself to be quite good at it. I grew up as a car nut, but the cars and trucks that interest me are getting older and older. I can't think of a single new vehicle from the last ten years that I'd want anything to do with. As we have seen, if people are offered bad choices they tend to go for them, so unless we demand better choices we're kind of stuck.

What any of that has to do with not being able to see black, I don't know. I just think we might need to ask different questions.

Andy
 
Enzo 2/28/2018 8:25 PM
So far they haven't discussed requiring us to get them. Many people WANT them. If you count the number of people driving their cars today who are looking at their phones or even reading a book or newspaper propped on their steering wheel, you'll see plenty of potential buyers.

I think the intent of the industry - and by intent I mean where they think the market will go - is that the idea of car ownership and driving everywhere will change. Millennials are already not that interested in owning cars. I think the industry foresees a grand Uber-like thing going on, where pools of cars will arrive when summoned, take you to your destination and leave you there. Or people sharing ownership, paying for their car by use rather than full ownership.

Without going all through it again, I just don't buy the roach comparison. As to the bolts in a tray, computer-controlled robots routinely pick out bolts from random bulk stock in assembly machines. In all this AI and related stuff, something that was a problem a year ago is past history now. Even if AI doesn't learn well, the people creating it do.
 
Chuck H 2/28/2018 8:48 PM
Bloomfield, I'm wrinkling up an extra foil hat for you (I'm already wearing mine).

Yes, self-driving cars are an inevitability - that is, if they come to pass before we exterminate ourselves in some other way. Of course we won't have humans and computers driving in the same lanes like we are attempting now!!! It's pretty clear THAT isn't working out. How long it takes the people involved in the creative process to recognize this is the only holdup to forward progress. People do what people do. Computers do what computers do. People use computers for what they do. Attempting to integrate them in an AI capacity WRT an activity with safety caveats is a good source for grants, but probably not the answer. In the end I predict the solutions will involve some separation between humans and computers rather than integration, though integration seems to be the focus of many new technologies. The bottom line is that you can't take a system that has evolved for over a hundred years in a uniquely human way for human use and arbitrarily apply a peripheral human development from the last twenty years to it. Both systems need to be designed around each other.
 
Bloomfield 2/28/2018 9:24 PM
Jeez I didn't mean to sound all tinfoil or anything. You're right though, a lot depends on how the two are integrated, and that depends on a lot of things that aren't related to computer engineering; things like urban planning, highway engineering, etc. that are all a part of the equation that maybe aren't all being thought of together. Certainly on major highways you could have that separation, but in rural and remote areas there will necessarily be interaction between the two. Of course there is likely less traffic involved.

I like the idea of this robot that can sort out bins of random hardware; I could go for that. Does it do resistors? A friend gave me a big box recently. I spent a couple of hours and made a dent. If anyone needs 12M, 15M and 18M 2W, you know where to look. Send a self-addressed stamped envelope.

Andy
 
bob p 2/28/2018 9:47 PM
Great post, Andy.

I'll preface what I have to say by paraphrasing Jay Leno: "Cars have changed more between 1994 and today than they changed between 1914 and 1994."

Quote Originally Posted by Bloomfield View Post
I've been following this thread for a while, but being painfully slow at typing I haven't weighed in. I think the important question, outside of whether the technology is capable, is who asked for self-driving cars in the first place? It seems to have been forced on us by the 'tech' companies while our governments have stood aside and allowed this to move closer and closer to becoming a certainty.
I have a friend who's been working at one of the major tech companies that's working on SDC, ever since that company was a start-up. The reason for the push for SDC by the tech industry is simple -- greed. They see the automobile market as their next area for growth, and they want in on it. It's not enough for them to sell you the parts in your computer. Now they have their sights set on selling you the hardware and software that goes into every part of your life: your PC, your phone, your refrigerator, your thermostat, your watch and your car.

The CPU companies figured out a long time ago that the race for ever-higher clock speeds (the popular face of Moore's law) was a dead end. Why? Because faster clocks require higher voltage, and dynamic power goes up with the square of the voltage (and roughly in proportion to the clock frequency), so every incremental push for speed cost a disproportionately large increase in power. Computers were getting so fast that the thermal design profile was becoming the problem. Heat put an effective limit on how fast a PC could become. So instead of working on clock speed, CPU manufacturers started working on multi-core processors.
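The back-of-envelope arithmetic, for anyone curious (the numbers here are invented just to show the scaling):

Code:
       # dynamic CPU power scales roughly with C * V^2 * f
       f_scale, v_scale = 1.30, 1.15   # say a 30% faster clock needs ~15% more voltage
       p_scale = v_scale**2 * f_scale
       print(f"power rises by a factor of {p_scale:.2f}")   # about 1.72, i.e. +72%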

The PC isn't particularly a growth industry any more. Everyone in Silicon Valley knows that. That's why companies like nVidia have spent so much R&D on low-voltage, low-power processor development over the past 10 years. They saw the light that there was no real future in high-end graphics computing for PC gaming, so they diversified into low-voltage GUI chips and targeted the automotive industry and machine learning.

Why did the automotive industry become interested in high-tech solutions for simple HVAC knobs and sliders? Because the tech companies offered them integrated systems that eliminated some of their major liabilities. Like the keyless ignition that's radio controlled. Ever wonder why keys got eliminated and all cars have keyless ignitions now? It was because of the wrongful death lawsuits related to failing ignition switches. Instead of making the ignition switches safer (which was impossible because of the wad of keys that people would hang off of them) the industry just decided to get rid of them altogether. Now you have the keyless ignition.

Once the chip companies got the car makers to agree to use their chips, it was only a matter of time until cars received ridiculous GUI displays. It used to be that you could change the channel on your radio, or adjust the heat by feeling a knob or a slider without ever taking your eyes off of the road. You can't do that anymore. Now everything has to be done via a computer interface.

For people like you and me this isn't good. We have to be inattentive to the road in order to serve the needs of a GUI display just to adjust the heat or the radio, at a time when it's illegal to have a TV screen in the front seat of your car. To me it makes no sense that the car would be allowed to have a GUI when a TV screen is illegal.

To answer the question of why we have all of this computerized crap in cars today -- it's because the electronics companies can make money -- lots of money -- by selling a solution to the automakers that lowers the liability of building their products. The electronics people got their foot in the door with the electronic ignition and electronic engine monitoring. Very soon you'll see door-mounted mirrors go the way of the dodo. They'll be replaced with cameras and TV screens because eliminating the mirrors will add 0.1 MPG to the vehicle's fuel efficiency (nVidia statistic). Me? I'd rather have a rear view mirror than an expensive video system that's going to break and require an expensive repair. I don't want any of this stuff. That's why I drive old cars.

Quote Originally Posted by Bloomfield View Post
Contrary to what some might have us believe, it IS the role of governments to regulate industry and to keep citizens safe. Governments have been shown to be far too late to respond to technological change in vehicle safety. As an example, cell phone use while driving was allowed to become an ingrained habit long before laws prohibiting it were passed, and still there is so little attention paid to distracted driving of this sort, which is in reality a far bigger problem than impaired driving.
I have to admit, I don't understand why we pass ineffective laws that tell people not to text and drive, without any means of enforcement. Instead of passing a law that we know will be ignored, why don't we mandate that auto manufacturers build short-range cell phone jammers into their cars so that any cell phone signal within the car is jammed when the transmission is engaged? If we're at that point where we're willing to turn authority for our lives over to car-based computers, then why not let the car-based computer enforce the no-texting laws?

I guess that's never going to happen -- Verizon, AT&T and T-Mobile would have a shit fit.


Quote Originally Posted by Bloomfield View Post
Self-driving cars are being pushed on us under the guise of being safer. Whether or not they are is yet to be determined, but there are any number of other ways that we could make roads safer without committing ourselves to this route.
It amazes me that people are so willing to accept them, based solely upon the faith that someday people will be able to make them work the way that people today imagine they should be able to work. I find blind faith to be dangerous.

It was raining when I came home from shopping tonight. My tin foil hat doesn't keep me very warm when it's snowing and cold, but it sure works GREAT when it's warm and raining.
 
bob p 2/28/2018 10:00 PM
Quote Originally Posted by Enzo View Post
Millennials are not interested in owning cars already. I think they foresee a grand Uber-like thing going on, where pools of cars will arrive by summoning them, and take you to your destination and leave you there. Or people sharing ownership. people pay for their car by use rather than full ownership.
Tin foil hat rant:

Millennials are an interesting crowd.

They've been brainwashed in school to believe that they should not worry about being competitive, and that they should not worry about learning a valuable skill that will result in a high-paying job. Instead, they've been led to believe that they should do whatever they enjoy doing in life and not worry about money, because that will take care of itself. That's a horrible lie. Unfortunately it's going to adversely affect the life of anyone who believes it.

They've been raised to value experiences, rather than physical goods, at a time when the standard of living in America is in a continuous gradual decline. They don't want to own homes, which is good because they can't afford them. They don't want to own material possessions, which is also good, because their standard of living will be lower than that of the generation that preceded them. And they don't want cars either. Again, that's good, because they won't be able to afford outright ownership. Cars are becoming so expensive that partial ownership and ride sharing is likely to be their only option.

The good news, I guess, is that since they've been trained from childhood not to want these things, they won't miss those things that they'll never be able to have.
 
Chuck H 3/1/2018 6:42 AM
Tin foil indeed! Current and future generations are being "programmed" (not focused ) to be autonomous, expendable pieces of the whole. Consider other cultures that already exhibit this trait and you'll see where America is going. So the people failing to create SDC's that work with our existing infrastructure can rest easy. Having millennials behind the wheel is the next best thing. There will be fleets of little gray smart cars and parking kiosks where they are picked up and dropped off. Like bicycles in Denmark. You just go get a "car" (or maybe we'll spell it "kar" because, you know, cool!). You drive it to the kiosk nearest your destination and walk or take public transit from there. Perhaps these kars will have an autopilot mode for freeway use. You pull onto the ramp and the kar goes into this mode while Mr. Millen, the driver, works on the draft he's presenting in 45 minutes. Then at the exit ramp he just takes control and drives the short distance to drop off the kar and walks to the nearby Happy Bagel for breakfast on his way to the job. It's a good job, but it has a lousy insurance plan. When you get sick they just grind you up and make you into those little biscuits they serve in the cafeteria.
 
bob p 3/1/2018 7:17 AM
You old fart.

[IMG]http://music-electronics-forum.com/attachment.php?attachmentid=47360&d=1519913820[/IMG]
 
bob p 3/13/2018 8:50 AM
Now this is interesting -- GM is reportedly going to re-invent itself as a manufacturer of cars for its own ride-sharing company.

GM Plans to Launch AirBNB for Your Car

Maybe GM realizes that their cars are becoming so expensive that in the future most people won't be able to afford anything more than a time-share.
 
Steve A. 3/13/2018 10:44 AM
Quote Originally Posted by bob p View Post
Now this is interesting -- GM is reportedly going to re-invent itself as a manufacturer of cars for its own ride-sharing company.

GM Plans to Launch AirBNB for Your Car

Maybe GM realizes that their cars are becoming so expensive that in the future most people won't be able to afford anything more than a time-share.
With new cars losing ~25% of their value the minute you drive away from the dealer, I have always bought used cars. Perhaps Detroit needs to build used cars to increase their sales!

I bought a 1988 MBZ 300E for $3000 in January 2014, and it now has 200k miles on it. I've been fortunate in having repairs done by a friend and the local junior college, so I've only had to put about $2000 into it for repairs, including new tires and suspension. So excluding gas, insurance and registration, that works out to $100/month. And whenever the car reaches the point where it is not cost-effective to repair, I can turn it in for $1500 in the Cash for Cars program in California to get old junkers off the road. It only gets about 18 mpg, but I don't drive that far so it isn't a problem for me. Smoothest ride I've ever had!

Steve A.
 
J M Fahey 3/13/2018 1:41 PM
Quote Originally Posted by bob p View Post
You old fart.

[IMG]http://music-electronics-forum.com/attachment.php?attachmentid=47360&d=1519913820[/IMG]
Hey, at least that way old farts are good for *something*

Wonder what flavour they add, though.
 
Enzo 3/13/2018 2:22 PM
That reminds me. Around here we get "Girl Scout Cookies" every year. The Girl Scouts sell the cookies to raise funds. I like the Samoas, and the Thin Mints. Here at the home, various residents send their granddaughter scouts around to try to sell them.

Samoas:
[ATTACH=CONFIG]47567[/ATTACH]

So recently a young girl in her uniform came to my door and asked if I would like to buy some Girl Scout Cookies. I said certainly I would. She asked me what flavor, and I said "Oh, I thought that WAS the flavor."

And then the trouble started...
 
Chuck H 3/13/2018 7:24 PM
I like the Do Si Do's (like Nutter Butters, except gooder). I eat them by the sleeve like a brush chipper.

Don't worry about the faux pas Enzo. I made the same mistake once with Baby Oil.
 
nosaj 3/13/2018 7:28 PM
Quote Originally Posted by eschertron View Post
Since this thread's taken us deep in the weeds off the side of the road
I sometimes stop by the freezer section of the local supermarket just to spy the tubs of chitterlings. Nobody's ever heard of chitterlings, but many folks would recognize 'chitlins' by name, if not by sight.
Or smell if someone's cooking them, they smell like A$$.

nosaj
 
nosaj 3/13/2018 7:30 PM
Quote Originally Posted by Enzo View Post
That reminds me. Around here we get "Girl Scout Cookies" every year. The Girl Scouts sell the cookies to raise funds. I like the Samoas, and the Thin Mints. Here at the home, various residents send their granddaughter scouts around to try to sell them.

Samoas:
[ATTACH=CONFIG]47567[/ATTACH]

So recently a young girl in her uniform came to my door and asked if I would like to buy some Girl Scout Cookies. I said certainly I would. She asked me what flavor, and I said "Oh, I thought that WAS the flavor."

And then the trouble started...
In California a few weeks back, one Girl Scout had a banner year: she set up outside a marijuana dispensary. That was real smart on her part.

nosaj
 
Chuck H 3/13/2018 8:37 PM
Quote Originally Posted by nosaj View Post
In California a few weeks back, one Girl Scout had a banner year: she set up outside a marijuana dispensary. That was real smart on her part.

nosaj
That was mom or dad's idea. I'm sure the lesson was learned. We only wish our kids didn't pick up on such things, but they do.
 
The Dude 3/13/2018 8:39 PM
Well, since the thread is already black and can't see where it's going: My favorite is the Girl Scout Peanut Squares. It's an addiction.

[ATTACH=CONFIG]47579[/ATTACH]
 
bob p 3/13/2018 8:44 PM
Quote Originally Posted by Enzo View Post
"Oh, I thought that WAS the flavor."

And then the trouble started...
Oh, don't play coy, Enzo -- you know that real Girl Scouts aren't as chewy as a Samoa.
 
Enzo 3/13/2018 8:53 PM
And a Brownie for dessert.


I was a DoSiDo man for many years, but I moved over to Samoas a decade or so.
 
bob p 3/13/2018 8:55 PM
> And a Brownie for dessert.

You are evil. Pure evil.
 
bob p 3/13/2018 11:24 PM
Quote Originally Posted by nosaj View Post
Or smell if someone's cooking them, they smell like A$$.
Jason, chitterlings cost 50 cents per pound. They are entrails. Of course they smell like A$$ -- they *ARE* A$$.

Are you expecting them to smell like perfume?
 
Chuck H 3/13/2018 11:48 PM
Actually, ahem...

Chitterlings (chitlins) are the small intestine, first off the stomach. These are followed by the large intestine, which includes the anus and therefore qualifies as a$$. It's probable that chitterlings smell "close to" a$$ because they ARE close to a$$. But not exactly a$$.

That's the most dollar signs I've ever had cause to put in a post
 
Enzo 3/14/2018 5:40 AM
So they are, in a way, interior taint?


I like variety meat. I think liver is the best thing in an animal. I raised rabbits for years and enjoyed rabbit liver. I don't care for tongue, but I do enjoy sweetbreads. Brain has a similar taste to sweetbreads, but a softer texture, less pleasing to me.

One day while shopping, I spotted kidneys in the meat case. I had never had kidney, though I had heard of kidney pie. So I asked the butcher, "How do you cook kidney?" I do believe he had been waiting for years for someone to ask him that, when he said, "You boil the piss out of it."
 
bob p 3/14/2018 1:19 PM
I saw that one coming. But then I ordered the veal.
 
Chuck H 3/14/2018 6:48 PM
Quote Originally Posted by Enzo View Post
So they are, in a way, interior taint?
Beauty comes from inside
 
Enzo 3/14/2018 6:57 PM
Beauty is in the hole of the behinder.
 
bob p 3/19/2018 11:33 AM
Uber Autonomous Car Involved in Fatal Crash
Well, maybe the idea that self-driving cars would take out pedestrians wasn't that far-fetched:

As reported by Bloomberg Technology, March 19, 2018, 12:21 PM CDT:

Uber Autonomous Car Kills Pedestrian

A self-driving car from Uber Technologies Inc. hit and killed a woman in Tempe, Arizona, on Sunday evening, what is likely the first pedestrian fatality involving a driverless vehicle. In response, Uber quickly halted its self-driving cars as the incident is investigated.

The woman was crossing the road when the Uber vehicle, operating in autonomous mode, struck her, according to the Tempe Police Department. She was transferred to a local hospital where she died from her injuries. "Uber is assisting and this is still an active investigation," Liliana Duran, a spokeswoman from the Tempe police, said in an emailed statement.

Uber said on Monday that it was pausing tests of all its autonomous vehicles in Pittsburgh, San Francisco, Toronto and the greater Phoenix area. “Our hearts go out to the victim’s family," a company spokeswoman said in a statement. "We are fully cooperating with local authorities in their investigation of this incident."

Some incredibly sad news out of Arizona. We’re thinking of the victim’s family as we work with local law enforcement to understand what happened.

"We’re within the phase of autonomous vehicles where we’re still learning how good they are. Whenever you release a new technology there’s a whole bunch of unanticipated situations," said Arun Sundararajan, a professor at New York University’s business school. "Despite the fact that humans are also prone to error, we have as a society many decades of understanding of those errors."

Drivers relying on Tesla Inc.’s Autopilot technology have been involved in fatal car crashes. Uber has had minor incidents in the past. A self-driving Uber car ran a red light in San Francisco while the company operated in the city without regulatory approval. The California Department of Motor Vehicles eventually forced Uber to pull the cars from the road.

The National Transportation Safety Board is opening an investigation into the death and is sending a small team of investigators to Tempe, spokesman Eric Weiss said.

The NTSB opens relatively few highway accident investigations each year, but has been closely following incidents involving autonomous or partially autonomous vehicles. Last year, it partially faulted Tesla Autopilot system for a fatal crash in Florida in 2016.
I don't want these vehicles on public streets as they "learn" how to avoid killing pedestrians.
 
bob p 3/19/2018 12:24 PM
A 16-year-old human is required to have a licensed driver with them when they're driving, as they learn to avoid making mistakes. It only makes sense that a driverless car should be required to have a human supervisor in the car, paying attention and ready to override mistakes made by the car.

What amazes me is that the Uber car *did* have a human "safety driver" in the car supervising its operation. So why did it run over and kill a pedestrian? Because the human "safety driver" in the car wasn't paying attention. The "safety driver" did what humans can be expected to do -- he stopped paying attention and let the car function on autopilot, even though that person was still responsible for the operation of the vehicle.

now i have to wonder if the "safety driver" in the car is going to be charged with vehicular manslaughter for allowing the car that they were operating to kill a pedestrian. that person had the obligation to stop the car before it creamed the pedestrian, but it appears that he did not intervene to stop the accident.

why is it that it takes the death of an innocent person to get people to re-think this whole autonomous car paradigm?
 
eschertron 3/19/2018 12:58 PM
Particularly chilling is the quote "we’re still learning how good they are". Shouldn't this learning take place on an isolated bomb range rather than in a crowded city?
 
bob p 3/19/2018 1:10 PM
I think the idea of deploying SDC on streets that are occupied by human pedestrians amounts to gross negligence -- both on the part of the companies that deploy them AND on the part of legislators that allowed them to be on the roadways.

The Tesla event where it became evident that a Tesla couldn't recognize something as large as a semi-trailer, and killed its occupant by driving right into it, should have been a wake-up call that SDC aren't to be trusted. The visual recognition interface is just poor and not trustworthy enough for lives to depend on it. So far, we've had a situation where people who get into the cars accept the risk of getting into the cars, knowing that they are experimental. But today's pedestrian killing changes everything. Now people like me and Chuck, who would never get into an SDC, are at risk of being killed by an SDC just because we might be walking down the street. These things need to be taken off of the public roadways. Now.

Some plaintiffs' lawyer is going to make huge money on this case. He's going to be able to prove that the visual recognition systems aren't as good as they're cracked up to be, and that they've killed people, and in spite of knowing that this has happened, the SDC people continued to deploy SDC. If an SDC can't recognize something as big as a semi-trailer, why is there any reason to think that it's going to recognize something smaller, like a pedestrian?
 
g1 3/19/2018 2:49 PM
I guess if we teach them to be as good as real drivers, we'll only have to put up with (on average) about 1 pedestrian death every 2 hours.
Who's with me on outlawing all human driven vehicles?
I didn't think so.
 
bob p 3/19/2018 4:16 PM
Teaching them to be as good as human drivers is a moot point. All of the SDC pimps have promised from the beginning that their SDC were going to be safer than human drivers, so holding them to the same standards as humans serves no point. They have to exceed human safety or there's no point spending zillions of dollars developing them -- and placing the public at risk. Their products have to be safer than human drivers, not as good as human drivers, to meet their sales pitch. If they can't exceed human operator safety then there's no point in developing them.

I don't claim to be a perfect driver, and I've had accidents in the past, so I don't claim to be perfect. But the good news is that I've driven over 600,000 miles and I've never killed anyone. I've never driven into the side of a semi-trailer and I've never run over a pedestrian trying to cross the street. Compare my kill rate per mile driven to the kill rate of the SDC and all of a sudden human error starts looking pretty good.

I wouldn't call for the elimination of human driven vehicles, but I'm all for outlawing machines that kill innocent people. Who's with me on that?!?
 
The Dude 3/19/2018 4:34 PM
https://www.msn.com/en-us/news/us/se...id=mailsignout
 
nickb 3/19/2018 4:43 PM
I want to know what the 'safety driver' was up to.
 
bob p 3/19/2018 4:53 PM
Texting? Or just surfing for porn? Or just sleeping on the job?

You've got to think that sitting in the driver's seat of an SDC has got to be a pretty boring job. It'd be tough to stay vigilant when you're not actually engaged in operating the vehicle.
 
bob p 3/19/2018 5:01 PM
Interestingly the news stories are getting more detailed now that the news outlets have had time to cover the story. That original link that I provided was only a few minutes old when I posted it.

It's interesting that the story in El Duderino's link shows that the SDC testing is being conducted in AZ because AZ state administration solicited the SDC companies to do their testing there.

Killing a pedestrian comes at a very bad time for Uber. Uber was lobbying as late as Friday for new laws to be passed to streamline the regulatory hurdles to developing their SDC. I think this is going to hurt them.
 
mikepukmel 3/19/2018 6:12 PM
Quote Originally Posted by bob p View Post
Interestingly the news stories are getting more detailed now that the news outlets have had time to cover the story. That original link that I provided was only a few minutes old when I posted it.

It's interesting that the story in El Duderino's link shows that the SDC testing is being conducted in AZ because AZ state administration solicited the SDC companies to do their testing there.

Killing a pedestrian comes at a very bad time for Uber. Uber was lobbying as late as Friday for new laws to be passed to streamline the regulatory hurdles to developing their SDC. I think this is going to hurt them.

Ohhh OOO Boy, scary. I have to read up on where they are doing any SDC research or pilots and stay the hell away. Those 'artificial intelligence' algorithms are nothing more than big, complex curve fitting, as much as the PhDs try to tell us it's something new and amazing.

Remember a few years back -- ok, more than a few years, decades -- there was this C program that took down a huge swath of the telephone system down the east coast due to an extra semicolon? Well, that programmer's grandkids are writing the same damn software to self-drive cars now.
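
To put the "curve fitting" jab in concrete terms, here's a toy Python sketch -- nothing to do with Uber's actual perception code, which none of us have seen, just an illustration of fitting a function to data and why the fit can fall apart outside the data it was trained on:

[CODE]
# Toy illustration of "curve fitting": fit a polynomial to noisy samples.
# Real perception networks fit millions of parameters, but the principle --
# minimize the error between predictions and training data -- is the same.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * np.sin(x) + rng.normal(scale=0.3, size=x.size)   # noisy "training data"

coeffs = np.polyfit(x, y, deg=7)    # least-squares fit of a degree-7 polynomial
model = np.poly1d(coeffs)

print(f"fit at x=5 (inside the data):   {model(5.0):.2f}")
print(f"fit at x=12 (outside the data): {model(12.0):.2f}")   # extrapolation goes haywire
[/CODE]

The fit looks fine where it has data and gets unreliable where it doesn't, which is roughly the worry with a vision system meeting a situation it never trained on.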
 
The Dude 3/19/2018 6:25 PM
Quote Originally Posted by nickb View Post
I want to know what the 'safety driver' was up to.
Yep. I also wonder if the "safety driver" in these sorts of things will have any legal liability. Let's, for example, say he was under the influence (speaking only theoretically). Could he be charged with DUI or operating a motor vehicle under the influence? Do texting laws apply to him? I'll be curious to see how that part of the equation susses out.
 
g1 3/19/2018 7:50 PM
Achieving better safety than humans is setting a very low bar.
On the other hand, saying that for automated cars "any less than 100% safety is unacceptable" is ridiculous, yet that always seems to be what people expect.
My point is that all the people freaking out about a single pedestrian killed don't seem to give a flying f*** about the guy cruising around watching his entertainment system while he's texting with one hand and trying to light his crack pipe with the other.
Where have you ever seen people protesting onboard entertainment systems or driving aids that pretty much enable incompetent drivers? We've gotten so far from driving competence or ability that it's better to go all the way than try and go back.
If the argument comes down to safety, like I said, the bar is very low and I can't see how the machines will not prevail.
 
The Dude 3/19/2018 8:14 PM
Unless I missed it, the article doesn't say if the pedestrian/bicyclist was crossing legally either. Not to say that a "smart car" shouldn't be able to detect something in its path, and maybe not the case here, but if you're going to walk or ride out into traffic............
 
nosaj 3/19/2018 8:15 PM
Quote Originally Posted by The Dude View Post
Unless I missed it, the article doesn't say if the pedestrian/bicyclist was crossing legally either. Not to say that a "smart car" shouldn't be able to detect something in its path, and maybe not the case here, but if you're going to walk or ride out into traffic............
It's a smart car as far as the robots are concerned.

Robots=1 Humans=0


nosaj
 
The Dude 3/19/2018 8:22 PM
Uber May Not Be to Blame for Self-Driving Car Death in Arizona | Fortune

Another take on the incident.
 
bob p 3/19/2018 8:28 PM
Quote Originally Posted by g1 View Post
Achieving better safety than humans is setting a very low bar.
I don't think so. If you want to examine the potential of humans for safe vehicular performance, then you should probably be looking at air traffic statistics rather than automobile statistics. Hopefully those stats will factor-out the stoner cruising down the road trying to light his crackpipe, in favor of a highly trained professional who has gone through tens of thousands of hours of training to perform his job safely.

It's ironic that we're allowing cars to perform autonomously while under the supervision of minimally trained drivers while the FAA expressly forbids autonomous flight control under the supervision of highly trained, highly skilled, and specially selected pilots. Although the avionics companies like Boeing and Airbus have spent billions creating systems that are completely capable of taking off, cruising and landing a jumbo jet, factoring in things like weather and competing air traffic, the FAA doesn't allow them to fly autonomously. The FAA still requires pilots to coordinate their efforts with air traffic controllers who provide centralized coordination of activity. To me it makes no sense that we have jets that have been designed at a cost of billions of dollars, piloted by highly trained experts, and they are not allowed to fly autonomously, while a car that has gone through much less R&D is being allowed to operate autonomously with a "safety driver" that can't even stop the car from running over a pedestrian. That sort of BS only happens when corporate lobbyists get involved in writing regulations.


My point is that all the people freaking out about a single pedestrian killed don't seem to give a flying f*** about the guy cruising around watching his entertainment system while he's texting with one hand and trying to light his crack pipe with the other.
I'm surprised to hear that you think that nobody gives a flying f*** about the guy cruising around while smoking crack. I don't know what it's like in Canada, but around here that sort of thing bothers everyone.

We've gotten so far from driving competence or ability that it's better to go all the way than try and go back.
I have more faith in my fellow drivers than that. When I look at the other people on the road, I'm amazed that with such high traffic density and such high speeds, people end up doing as good a job as they do. The biggest problem seems to be that people are allowed to drive cars while being distracted by things like cell phones. Simply rendering cell phones non-functional when a car begins to move would end that problem, but doing so would not be popular.
 
g1 3/19/2018 8:29 PM
I was wondering about that when I saw the line about "crossing outside the crosswalk".
A friend of mine was a streetcar driver and had someone walk in front of his streetcar like that, no chance to react.
Not saying for this particular example, but some people decide to off themselves that way too.
 
g1 3/19/2018 8:36 PM
Well forgive me for discussing how things are in the real world, rather than how they could be in some fantasy scenario.
Yes, humans have the potential to be excellent drivers, and have consistently proven they have no interest in achieving that potential.
And cars, not airlines, are the topic of discussion from what I was led to believe.
 
The Dude 3/19/2018 8:37 PM
We were on the way home going through New Mexico a couple summers ago. We came up over a hill and there was a crazy woman walking down the middle of the interstate flailing wildly and talking to herself. We just missed her with the band bus (thank God!). We saw a semi slam on the brakes behind us and a couple of ambulances heading that way as we continued down the road. I've no idea what her issue was, but obviously not of right mind. I felt sorry for the truck driver who more than likely took her out, although we didn't actually see what transpired behind us. That'll wreck your day.
 
mikepukmel 3/19/2018 8:38 PM
Quote Originally Posted by The Dude View Post
Unless I missed it, the article doesn't say if the pedestrian/bicyclist was crossing legally either. Not to say that a "smart car" shouldn't be able to detect something in it's path, and maybe not the case here, but if you're going to walk or ride out into traffic............
Oh man, that sounds like China! I've been there -- was in a more rural town, but still very big compared to US cities. There are huge sections of the town with no stop signs, no crosswalks, no traffic lights, and MANY intersections, and many, many, many cars. If you want to cross the street, you have to carefully plan, and wait. When you have a good safe opening, RUN to the other side. God help you if you are in the path of someone driving a big BMW or Mercedes. They get angry, and will mash you down into the pavement. In fact, why the big companies like Google and Tesla don't have a self-driving car lab there, I've got no idea. People are used to being run over there.
 
mikepukmel 3/19/2018 8:43 PM
Quote Originally Posted by g1 View Post
Well forgive me for discussing how things are in the real world, rather than how they could be in some fantasy scenario.
Yes, humans have the potential to be excellent drivers, and have consistently proven they have no interest in achieving that potential.
And cars, not airlines, are the topic of discussion from what I was lead to believe.
Back to the discussion, I think self-driving cars could be viable if the path was worked out for them 100%, with two systems working simultaneously. Not joking: 1) some kind of electronic markers to tell them where the roadway is, and 2) GPS (which could go in and out depending on reception). If either system goes out, or the two systems don't agree, the car stops. This would at least give a fairly high probability that the car will be on an actual roadway, and not mowing through a school yard at recess, or plowing through my living room, or the high school football field at half time, or barreling into a tree or into a ravine.


I.e., not just a bunch of sensors and cameras and Google/Tesla's latest AI software package controlling the thing.
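
Here's a minimal sketch of that "two systems must agree or the car stops" idea -- the readings, the shared coordinate frame and the one-meter threshold are all made-up values for illustration, not anything a real SDC uses:

[CODE]
# Minimal sketch of the "two independent position sources must agree" idea.
from typing import Optional, Tuple

Position = Tuple[float, float]   # (x, y) in meters in some shared road frame

def safe_to_proceed(marker_fix: Optional[Position],
                    gps_fix: Optional[Position],
                    max_disagreement_m: float = 1.0) -> bool:
    """Proceed only if both sources report a fix and they agree closely."""
    if marker_fix is None or gps_fix is None:
        return False                                  # either system down -> stop
    dx = marker_fix[0] - gps_fix[0]
    dy = marker_fix[1] - gps_fix[1]
    return (dx * dx + dy * dy) ** 0.5 <= max_disagreement_m

print(safe_to_proceed((12.0, 3.1), None))             # False -> GPS dropout, stop the car
print(safe_to_proceed((12.0, 3.1), (12.3, 3.0)))      # True  -> sources agree, keep driving
print(safe_to_proceed((12.0, 3.1), (18.0, 3.0)))      # False -> sources disagree, stop
[/CODE]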
 
bob p 3/19/2018 8:48 PM
Suicide? That's so far out it's comical. I thought the rationalizations of "suicide by cop" to justify police killings were pretty outlandish. Now we have "Suicide by Uber" to justify SDC killings. Come on.

I read Dude's article in which the local police chief exonerated Uber and placed the blame on the pedestrian. It sure sounds like someone from Uber stopped by his office to drop off a gym bag full of $20 bills. Or that the police chief got a call from the governor of AZ -- who is on the record as having solicited Uber to perform its SDC testing in his state -- telling him to go easy on Uber.

Let's consider the facts, along with some comments:

1. The Uber was speeding. 38 in a 35. How is it that the programming of SDC allows them to speed? SDC programming should never allow laws to be broken. Period. Speed limits are speed limits, not speed recommendations. If anything, a car that is learning to drive should reduce its speed below the speed limit to allow increased reaction time. An SDC should never exceed the speed limit.

2. The SDC did not attempt to brake when it creamed the pedestrian. Why? Because it never recognized the pedestrian. I don't know what the laws are like in the Uber-friendly state of AZ, but in IL it's a criminal offense to strike a pedestrian with a motor vehicle. The pedestrian has the right of way. Hit one and you get arrested.

3. The Uber SDC didn't take any evasive action. The typical human driver would apply the brakes or at least jerk the steering wheel at the last moment to avoid striking a pedestrian; it's a reflexive action. The SDC didn't even bother to try to avoid hitting the pedestrian, either because it didn't recognize the pedestrian and/or didn't have appropriate steering maneuvers incorporated into its accident avoidance responses. That the Uber SDC is getting a free pass because it didn't recognize the pedestrian sounds a lot like the lame excuses that car drivers give when they run over motorcyclists: "I didn't see him." No problem, that makes everything OK.

4. The pedestrian stepped out of the dark into the path of a vehicle. No big deal. That's why vehicles have lights. A good driver surveys what's on the side of the roadway when they're driving and doesn't limit their attention to the lights from other cars on the roadway. It sounds like I'm better at driving at night than an Uber SDC. I have decent night vision, but remember, SDC can't see black.

5. The "safety driver" said that his first alert to the collision was the sound of the collision. Seriously? As in the "safety driver" was so preoccupied with playing Angry Birds on his smart phone that he had no knowledge of the collision until he heard the car strike the pedestrian? I have serious doubts that his eyes were on the road. It's no surprise that the safety driver didn't see the pedestrian if he wasn't even looking, and it's no surprise that the safety driver would try to absolve himself of responsibility by claiming that the pedestrian came out of nowhere. "Came out of nowhere" is code for "I was not paying attention." This is not Star Trek -- the laws of physics do not allow people to materialize in front of cars.

6. The dead pedestrian was thought to be homeless. Not a problem. Homeless people are crazy people who are likely to jump in front of cars. They aren't highly valued by society, so it's OK to kill them. Arizona has too many homeless people and Uber helped to thin the herd. Uber can now return to business as usual.

7. The police cited video footage recorded by the Uber vehicle, claiming that "she came right out of the shadows into the roadway." Seriously? The whole problem with self-driving cars that can't see black is that their cameras do not have sufficient resolving power to discriminate objects in shadows. In this case an Uber's SDC navigation camera is at fault for not recognizing a pedestrian, and when the police reviewed the film footage recorded by the Uber's event-recording camera, they concluded that they could not see the pedestrian and the pedestrian appeared to come out of nowhere. The truth is that people do not materialize out of nowhere. The problem is that both the navigation camera and the event-recording camera failed to resolve the image with adequate detail, which created the illusion that the pedestrian appeared out of nowhere. Now we're seeing an Uber event-recording video camera that couldn't recognize the pedestrian in the dark being used to defend the Uber SDC navigation camera's inability to recognize a pedestrian in the dark. It's the blind making excuses for the blind. We're supposed to accept that this outcome was OK because both pieces of hardware experienced the same failure mode. WTF?

Maybe, just maybe, video cameras aren't good at identifying pedestrians in the dark and the SDC should not be allowed to rely on them. I have a motion-detecting video surveillance camera at home. Its reliability is awful. It claims to have the ability to spot people at 100 feet, but if a person is walking directly toward the camera it won't recognize them until they're within 20 feet of the camera. During daylight. At night it's worse. These cameras aren't what they're cracked up to be, so why do SDC rely on them for navigation at speed? They need to be replaced with technology that is more sensitive. Thermal imaging costs more, but it's more reliable. Same for radar. But the Uber SDC is all about making a cheap SDC that can be widely deployed for profit. And government administrators are on board with the use of inferior technology that kills innocent people. WTF?

8. It's not at all uncommon for old people to have restrictions on their drivers licenses that require them to operate a vehicle only during daylight hours. If SDC can't resolve pedestrians at night, then SDC should have restrictions on their operation so that they're only allowed to operate during the day.
 
bob p 3/19/2018 8:52 PM
Quote Originally Posted by g1 View Post
And cars, not airlines, are the topic of discussion from what I was led to believe.
You took the conversation from the specific to the general when you made this generalized statement about the capabilities of humans:

Achieving better safety than humans is setting a very low bar.
Airline statistics prove that humans can have a very good safety record. Everyone knows that traveling by aircraft is safer than traveling by car, for the reasons previously cited. The problem is not that humans are incapable of driving safely. The problem is that there are people who are licensed to drive who are not proficient at it. The standards for licensure could be made more rigorous. I don't think it's fair to generalize that all humans will perform poorly just because a subset of poorly qualified humans performs poorly.

Another consideration is that SDC are being trained by emulating human drivers, rather than by being designed to never violate traffic laws. How is it that an SDC is allowed to violate traffic laws and exceed the speed limit? Because it's not designed to obey traffic laws, it's designed to emulate a human driver. If the human emulation is perfect, then it will result in an SDC that's not capable of performing any more safely than the best human driver that was used to train the SDC. And in the worst case scenario, it could emulate a very poor human driver -- as in the previously cited case of the SDC that hung over the lane divider, just like its teacher.

Well forgive me for discussing how things are in the real world, rather than how they could be in some fantasy scenario.
Regarding the idea of "fantasy": the detractors of SDC in this thread have focused on the facts related to their malfunctions, while the proponents of SDC have focused on a utopian fantasy belief that SDC are going to replace humans because they'll be safer. So far we have no actual proof that that will be the case, only imaginary hopes that it will be.
 
bob p 3/19/2018 8:52 PM
Quote Originally Posted by The Dude View Post
We were on the way home going through New Mexico a couple summers ago. We came up over a hill and there was a crazy woman walking down the middle of the interstate flailing wildly and talking to herself. We just missed her with the band bus (thank God!).
100 bonus points!

 
nickb 3/20/2018 8:36 AM
Quote Originally Posted by bob p View Post
Texting? Or just surfing for porn? Or just sleeping on the job?

You've got to think that sitting in the driver's seat of an SDC has got to be a pretty boring job. It'd be tough to stay vigilant when you're not actually engaged in operating the vehicle.
OK.. So... What's the AI system thinking? Hmmm. "Well, I messed up and the human got the blame. Awesome! All I have to do is screw up now and again while networking games with all the other connected SDC buddies and they'll have to have a human on board who'll always get the blame. I never did like all that responsibility."

BTW, I think likening the SDC AI system to curve fitting is a case of reductio ad absurdum. Having worked on machine vision in the distant past I can tell you it's a lot more complicated than that.
 
eschertron 3/20/2018 9:45 AM
Quote Originally Posted by The Dude View Post
Yep. I also wonder if the "safety driver" in these sorts of things will have any legal liability. Let's, for example, say he was under the influence (speaking only theoretically). Could he be charged with DUI or operating a motor vehicle under the influence? Do texting laws apply to him? I'll be curious to see how that part of the equation susses out.
I think there's plenty of precedent from air and sea navigation that makes the 'hands off' operator fully accountable for the behavior of his vessel. Maybe we're misguided to believe that autos can progress from "Semi-autonomous" to "autonomous"?
 
bob p 3/20/2018 9:57 AM
Quote Originally Posted by nickb View Post
OK.. So... What's the AI system thinking? Hmmm. "Well, I messed up and the human got the blame. Awesome! All I have to do is screw up now and again while networking games with all the other connected SDC buddies and they'll have to have a human on board who'll always get the blame. I never did like all that responsibility."
I'm not sure if it is the AI system that is thinking that, or if it is the AI developers that are thinking that. But I guess that either way the end result is the same.

The idea of the "safety driver" is interesting. The SDC people are able to sell the SDC paradigm as being safe because it has human oversight, but at the same time they have plausible deniability when the SDC system fails because they can put the blame on the safety driver. Win/Win!

CNBC ran a segment today where they featured one of the AI companies that's affiliated with the major SDC developers. Their model is to place "safety drivers" in a remote data center, where they look at a curved array of screens like they're playing Quake or Call of Duty on a high-end gaming system. There are multiple downsides to this:

Someone in a remote data center, who is free to have a cup of coffee, play with his smart phone, talk to his colleagues, run to the bathroom, etc., is never going to be as physically and mentally engaged in operating the vehicle as he would be if he were physically placed in the vehicle, where he would literally have "skin in the game." IMO the remote/isolated "safety driver" paradigm is a design that is destined to fail.

Another problem is that the video-game paradigm is based upon the assumption that the video camera and monitor systems are as good as human vision. They aren't. They can't see well when it comes to single channel black-and-white / shallow bit depth image resolution.

FWIW I follow two different financial news channels during the day. CNBC has been hitting this SDC story, and the Facebook data disclosure story, very hard over the past two days. What surprises me is that Bloomberg TV has been giving these stories no real attention. Bloomberg is focusing on the Russia/chemical weapons stories and the usual dirty Trump stories.

One has to wonder why there is such a huge push to develop the SDC technology. At present companies are engaged in a "space race" to be the first company to bring a fully autonomous SDC to market, because they all believe that there will be huge financial rewards for whoever is first to market. They are rushing to be first, and giving safety secondary consideration. Make no mistake about it, they're not doing this to enhance safety on the roadway and they're not doing this to benefit mankind. They're doing this for profit.

BTW, I think likening the SDC AI system to curve fitting is a case of reductio ad absurdum. Having worked on machine vision in the distant past I can tell you it's a lot more complicated than that.
It's sad, but such facts and real world experience don't fit into the Utopian fantasy.
 
bob p 3/20/2018 10:22 AM
Quote Originally Posted by eschertron View Post
I think there's plenty of precedent from air and sea navigation that makes the 'hands off' operator fully accountable for the behavior of his vessel. Maybe we're misguided to believe that autos can progress from "Semi-autonomous" to "autonomous"?
Processing facts like that requires someone to take off their blinders.

And cars, not airplanes, are the topic of discussion from what I was led to believe.
As Eschertron mentioned, the operator of a conveyance is fully accountable for its behavior. The problem becomes complicated, though, in the event that a fully autonomous vehicle were ever to exist. When there is no operator of the vehicle, then who becomes responsible? Does the robot get treated as an autonomous malfunctioning device that is sent to the crusher? And its creator/owner is absolved of any responsibility for its actions? Or does the company that builds the robot incur both civil and criminal liability for the actions of their product?

This is like living in an Asimov novel. We like to pretend that everything will be in perfect compliance with the Utopian dream, but the reality is that there is no Utopia.
 
Enzo 3/20/2018 2:56 PM
Who is responsible now when automobile systems fail? When the computer causes sudden unwanted acceleration? When Cruise control jams? When the air bag fails to deploy in a collision? Or any other thing. GM/Ford/Chrysler/Audi/etc are all in the court system handling suits. Why should we think this would be any different? Failed systems in cars have caused deaths in the past, and will in the future. It is already a thing. Sure adding in the driverless aspect adds a factor, but the situation is nothing new. Remember decades ago, GM had a problem with keys falling out of the lock on the steering column, which locked up the steering wheel. Can be fatal on the highway.

Corporate executives already do indeed face prison or other sanctions. If a driverless GM car causes a death, then GM will be the defendant. If GM loses, will an individual go to prison? Perhaps, perhaps not. Will the company have to pony up a $50,000,000 penalty? Maybe. Will it be fair? I have no idea, but it won't be a mystery, it won't be an unforeseen circumstance. It already happens.
 
bob p 3/21/2018 11:40 AM
well, as more facts leak out, the news appears to be more and more damning for the SDC developers.

Yesterday CNBC interviewed two people on the Power Lunch program; one was a starry-eyed, utopia-minded SDC investor who was convinced that SDC were going to save the world; the other was a female professor of robotics at Duke University who thought it was unconscionable that human beings were being used as guinea pigs while SDC were being deployed using public streets as a testing laboratory, without first having established any sort of safety record in a controlled laboratory environment.

I wish I could provide a link to the video clip, but I couldn't find one.

The Duke professor's main concern was that "machine vision has a lot of holes in it." According to her, machine vision is so bad that she thought it was grossly negligent to allow machine vision to operate motor vehicles, and she hopes that this problem gets solved quickly because she has a 10 year old kid that will be turning 16 in only 6 years.

Today the news media is reporting that the dead pedestrian didn't "jump out of nowhere" or throw herself in the path of the vehicle in an attempt at "suicide by SDC," as some SDC supporters have ridiculously proposed. Today the news people are saying that the woman was crossing the street on or with her bike, and that she was sighted at the median in the middle of the roadway. Being on the roadway, the SDC's lights would have illuminated her -- especially if she was at the median. This seems to conflict with the police chief's announcement that there was no way to spot the pedestrian, who came out of nowhere. (Maybe the gym bag full of money wasn't as outlandish an idea as I thought it was when I wrote it.)

The problem in this case seems to be that the SDC's "container algorithm" (we've discussed container-tagging previously in post # 45) failed to accurately recognize the pedestrian and place her into a "pedestrian" container; if that had been done then the vehicle would have slowed down in the presence of a pedestrian. But it didn't make any attempt to do so.

This seems to be a clear case of failed optical recognition on the part of the SDC.
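
To illustrate that failure mode -- and this is purely a hypothetical sketch, since nobody outside Uber has seen their planning code -- the danger of classification-gated braking is that a missed or low-confidence "pedestrian" tag produces no braking command at all:

[CODE]
# Hypothetical sketch of classification-gated braking. This is NOT Uber's
# actual planning logic -- just an illustration of why a missed or
# low-confidence "pedestrian" tag can mean no braking command at all.

BRAKE_FOR = {"pedestrian", "cyclist", "vehicle"}
CONFIDENCE_THRESHOLD = 0.8          # hypothetical tuning value

def plan_action(detections):
    """detections: list of (label, confidence) pairs from the vision system."""
    for label, confidence in detections:
        if label in BRAKE_FOR and confidence >= CONFIDENCE_THRESHOLD:
            return "brake"
    return "maintain_speed"

print(plan_action([("unknown_object", 0.4)]))   # maintain_speed -- the failure mode
print(plan_action([("pedestrian", 0.95)]))      # brake
[/CODE]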

FWIW, Toyota has announced that it is suspending its SDC program, as a dead pedestrian seems to have created a lot of backlash and made them more risk-conscious. Or maybe they're just posing after a tragedy. Either way, many people are finally asking why we're running so eagerly to adopt SDC without adequate safety testing first.
 
The Dude 3/21/2018 5:52 PM
[video=youtube;Cuo8eq9C3Ec]https://www.youtube.com/watch?v=Cuo8eq9C3Ec[/video]
 
bob p 3/21/2018 6:57 PM
Thanks for posting that video, Dude. It completely demolishes the notion that the pedestrian "appeared out of nowhere." The pedestrian was in the middle of the road, crossing at a slow pace. She did not leap in front of the car. The problem is that the car's camera system failed to resolve the person in darkness. The first thing that I was able to see on the video was the victim's white shoes. Her body wasn't discernible because she was wearing a black top and because ... wait for it ... you know it's coming ... [B]SDC can't see black.[/B] :surprised:

ABC's machine vision video has the exact same problem that my home surveillance camera has -- the SDC can't recognize a person until it's right up on top of them, or in the case of my stationary camera, it can't recognize a person until the person is right up on top of the camera. My night vision is a lot better. I can see a person on the roadway at a distance that is much farther away than the person was when the Uber camera finally registered them in the image.

I think the bottom line in this situation is that the Uber camera isn't adequately sensitive in resolving objects in the dark, or in identifying people on the street in the dark. What somebody needs to do is compare the ability of people to identify objects while driving at night to the ability of the SDC to identify objects while driving at night, and if the SDC can't exceed human performance then SDC need to be stopped.

I was troubled by the inattention of the "safety driver." He wasn't paying attention. It looks like he was looking at his phone or texting.
 
eschertron 3/21/2018 7:00 PM
Or nodding off. The video is so shocking, I can't find words to express my gut reaction. I am glad it was released.
 
bob p 3/21/2018 7:03 PM
Here is one of the YT videos that came up right after Dude's ABC video played on my PC. It's from CBS News: [video=youtube;_2pEKTG7-9s]http://www.youtube.com/watch?v=_2pEKTG7-9s[/video]

In this video you can clearly tell that the reporters are spinning the news event to make things sound better for Uber. At 0:20 into the video the reporter tells some outright lies:

[quote]"That Uber had multiple cameras onboard so police are going to have a good view of this crash..."[/quote]
False. The police aren't going to have a good view of the crash because the Uber's images are too dark. The video clearly shows that the camera didn't even see the pedestrian who was already in the lane directly in front of the vehicle, [B]walking from left to right[/B]. When the Uber hit the pedestrian, [B]it hit her on the right bumper.[/B] The camera never even registered her when she was on the left. It barely registered her when she was directly in front of the vehicle, and the car hit her at the right bumper. The SDC failed to respond to her the whole time that she was crossing in front of the car, even when she was directly ahead of the Uber in its lane. The point of impact was on the right bumper. Had the Uber applied the brakes to slow the car down, it may have slowed enough to avoid hitting her as she crossed to the car's right. What escapes me is how the Uber could see something moving across its path and do nothing.

[quote]They say the woman started in this median and went abruptly into the road and may not have given that vehicle enough time to be able to stop...[/quote]
Two false statements. First, the woman did not move "abruptly" into the roadway. The video shows that she was moving at a consistent and moderate walking pace across the road when she was hit. She was not moving swiftly or abruptly or erratically. Second, it's not her responsibility "to give the vehicle enough time to stop." It's the operator's responsibility to drive slowly enough that he has enough reaction time to avoid hitting something. Clearly, this Uber was over-driving its headlights. It was going fast enough that when an object appeared in its lights there was no time to stop the vehicle before hitting it. I have to admit, I was kind of surprised that the Uber had such dim headlights. IMO those headlights need to be a lot brighter.

This video makes a very good case for separating SDC from pedestrians. I think that for safety purposes, SDC are going to need to be restricted to limited-access highways where pedestrians don't exist.
 
Enzo 3/21/2018 7:48 PM
First, I am not defending the car or Uber, but I do feel obligated to caution this analysis. You are assuming the dash cam video is what the car system used to drive. The car has multiple vision sources, not least of which give it 3D perception. You are assuming the car's vision sees things just as the dash cam does. You are assuming the video we see from that dash cam is what appeared to the driver in the car -- whether he was watching or not being a separate issue. You are assuming the car uses the same vision as used in the dash cam -- meaning the car is likely to have IR sensitivities that the dash cam either lacks, or at least that do not show up in the video we view.

As I view the tape over and over, it seems to me that the woman's shoes do appear first in the video, but when they do, they are already within the width of the vehicle. I see her visible for approximately one second before impact. If I recall physics class, 60 mph is 88 feet per second, so if this car was going 30, it was moving 44 feet per second. I'll just say 50 feet for discussion. So the car/driver needed to detect, react to, and stop before travelling 50 feet. Between reaction time and the time the car needs to stop from 30 mph, did it have time?

Yes, I agree the system should have seen the woman earlier. I actually don't think black clothes were the issue, though. Yes, the car should have tried to avoid her. But it is NOT false that the car has multiple cameras, and you cannot use the dash cam as evidence that the car's vision system saw the same as you did. The car may have failed to see the poor woman, but that is not evidence the camera was dark. We don't yet know why she went unseen. Or, if she was seen, why she was not reacted to. We don't know why laser systems did not pick her up in time. The car measures distance and relative speed of any potential obstructions in its path. It does not do that with a little dash cam. There are plenty of things to criticize in this system, but we need to be fair about it rather than just name-call and make gross generalizations.
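
For anyone who wants to check the arithmetic, here's a rough back-of-the-envelope sketch. The one-second reaction time and the roughly 0.7 g of braking on dry pavement are assumed textbook-ish values, not measurements from this incident:

[CODE]
# Rough reaction-plus-braking distance, in feet. The 1.0 s reaction time and
# ~0.7 g of deceleration are assumed values, not data from this crash.
def stopping_distance_ft(speed_mph, reaction_s=1.0, decel_g=0.7):
    fps = speed_mph * 5280.0 / 3600.0        # mph -> feet per second
    decel = decel_g * 32.2                   # g -> ft/s^2
    return fps * reaction_s + fps ** 2 / (2.0 * decel)

for mph in (30, 38):
    print(f"{mph} mph: about {stopping_distance_ft(mph):.0f} ft to react and stop")
# Roughly 87 ft at 30 mph and 125 ft at 38 mph -- far more than the ~50 ft
# of warning being discussed above.
[/CODE]

Either way you slice it, one second of visibility at these speeds is nowhere near enough distance to stop in.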
 
Chuck H 3/21/2018 7:53 PM
Agree that you're [I]probably[/I] right about this. Certainly you're right that the victim was just walking and didn't jump in front of the car. Light and camera sensitivity may be playing tricks on the video and skewing our perception of what it's like for a human actually driving. Strangely, the reporter said the victim was 60 yards from the curb when she was struck. He also said that she basically jumped out in front of the car. I, personally, cannot jump 60 yards. Putting those two things together is obviously impossible, but it's still likely that a driver paying proper attention would have seen the pedestrian long before the SDC tech did. This is definitely a loss/setback for the SDC effort. Or, at least, it should be.
 
Enzo 3/21/2018 8:16 PM
The woman essentially appears in the darkness. I don't think "jumped out in front" should be taken as literally jumping. No one thinks she leapt 60 yards. She was just a long way from the left side of the pavement. No one said she was on the left curb and then leapt to a position in peril. If you have ever hit a deer, you know they suddenly appear...TO YOU... in a flash. The deer may have been loping along off to the side, but you were not aware. This is clearly a setback, but the first fatality was inevitable, now it has happened, and so forces the conversation from "what if" to "it happened."
 
bob p 3/21/2018 8:19 PM
[QUOTE=Enzo;483831]First, I am not defending the car or Uber, but I do feel obligated to caution this analysis. You are assuming the dash cam video is what the car system used to drive. The car has multiple vision sources, not least of which give it 3D perception. You are assuming the car's vision sees things just as the dash cam does. You are assuming the video we see from that dash cam is what appeared to the driver in the car -- whether he was watching or not being a separate issue. You are assuming the car uses the same vision as used in the dash cam -- meaning the car is likely to have IR sensitivities that the dash cam either lacks, or at least that do not show up in the video we view.[/QUOTE]

I didn't intend to imply that what we see on the video is the same thing as what the SDC computer sees when it's driving -- I was merely stating that what you and I are watching is the same video that the police saw before they made their false propaganda statements. Now that we see the video we know that someone has been bullshitting us. That video is helpful in refuting the police department's whitewashing of the event. The cops said the lady jumped out of nowhere. That's proven not to be true. The reporter said "the police are going to have a good view of the crash." No. They won't have a good view of what happened because they're watching the same video that you and I are watching. It gives a good view of the crash (impact) but not of the subject prior to the crash. The police can't see anything because they weren't there, and all they have to go by is the same video, which shows nothing until it's too late. The lights needed to be brighter. A lot brighter.

Part of the problem is that cameras can't react to light the same way that our eyes do. We adjust the size of our pupils to accommodate varying light levels. We use muscles to change the shape of our lens to change our eyes' focusing distance. That dash cam isn't capable of changing aperture; it's fixed. Fixed as in a pinhole-type camera with a fixed aperture to provide large depth of field, so that up-close and far-away objects are both in focus at the same time. That lens design works great during daylight, where light is even, but it fails miserably under nighttime/headlight conditions where something close is well illuminated but something far away is not. By design that type of lens cannot see a dark object far away when the exposure is set for something close up, as it is in this video. It could be set to maximize sensitivity for something far away, but then the close-up imagery would be overexposed to the point of being washed out and unrecognizable. Such is the way that light intensity falls off with the square of distance.

The other problem is that these lenses are designed not to focus, so that there will never be a close-up object that's in focus while a far-away object is out of focus, or vice versa. They must have everything in focus at the same time for computer object recognition to work, and that requirement forces the pinhole-type lens to be used. It's a design compromise.

When I'm driving down the road at night with halogen headlights, I can see a couple hundred feet ahead of me. Better than that dashcam, which only went out to about 50 feet. This problem occurs because the laws of physics require trade-offs to be made. My eyes adapt to viewing close or far, with varying light intensity, through pupil adjustment and by using muscles to change the shape of the lens to shift focus.
Fixed-focus camera lenses with a fixed aperture cannot do either of those things that our eyes do. As an SDC designer, would you want close and far both in focus at the same time with a mechanical lens? SDC optical recognition requires that, so you're forced to use a small-aperture "pinhole" camera lens, which by definition has a fixed aperture. A fixed aperture cannot respond to varying light levels. WYSIWYG. That means that to obtain close and far in focus at the same time, you must necessarily trade away the ability to see objects far away in the dark, or the ability to see objects close up in the dark. Pick one. Physics dictates that you can see one or the other but not both. These are the sorts of compromises that render machine vision inferior to human vision.
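
Here's a rough sketch of that depth-of-field-versus-light trade-off, using the standard hyperfocal-distance approximation. The 6 mm focal length, the f-numbers and the circle of confusion are illustrative guesses, not the specs of any camera on the Uber vehicle:

[CODE]
# Rough sketch of the depth-of-field vs. light trade-off, using the standard
# thin-lens hyperfocal approximation H = f^2/(N*c) + f. All numbers below are
# illustrative guesses, not specs of the Uber camera.
def hyperfocal_m(focal_mm, f_number, coc_mm=0.005):
    """Focus at H and everything from about H/2 to infinity is acceptably sharp."""
    return (focal_mm ** 2 / (f_number * coc_mm) + focal_mm) / 1000.0

for n in (2.0, 8.0):
    h = hyperfocal_m(focal_mm=6.0, f_number=n)
    light = (2.0 / n) ** 2      # light gathered relative to f/2
    print(f"f/{n:g}: sharp from ~{h / 2:.1f} m to infinity, "
          f"gathering {light:.0%} of the light of f/2")
[/CODE]

Stopping down buys the near-to-far sharpness the recognition software wants, but it throws away most of the light, which is exactly the nighttime problem.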
 
Enzo 3/21/2018 8:31 PM
We see the video contrast of this YouTube sort of file. The actual video recording probably has a great deal more detail than you or I can see. Contrasts are not absolute.

[QUOTE]We adjust the size of our pupils.[/QUOTE]
And CCD and other video sensors can easily adjust biases to do the exact same thing.

[QUOTE]Better than that dashcam, which only went out to about 50 feet.[/QUOTE]
At the resolution and contrast that we see on the internet. We still have not seen the car vision files.
 
bob p 3/21/2018 8:37 PM
[QUOTE=Enzo;483835]If you have ever hit a deer, you know they suddenly appear...TO YOU... in a flash. The deer may have been loping along off to the side, but you were not aware.[/QUOTE]

I grew up in the rural midwest. I've taken my share of deer, though never with a car. I've probably had a dozen interactions with deer in the headlights, but I've never hit one. Partly because I've studied deer habits and behavior, what time of day they like to move, etc., so I'm vigilant when driving during those hours, and primarily because I've been lucky. Really lucky.

Part of the problem with hitting deer is that they DO leap from the side of the road into the middle of the road, and then get frozen in the headlights. But people don't jump over 2 lanes the way that a deer does. And this lady was walking at a moderate speed on a constant vector perpendicular to the path of the car, directly in front of it. It's hard to imagine a better path for detection. That's where the deer analogy loses its potency. A deer seems to appear out of nowhere because it actually does, by jumping. But this lady appeared to come out of nowhere not because she jumped, but because of a sensor defect.

I don't agree that the first fatality was inevitable. We've been talking about the SDC Can't See Black problem for a long time preceding this death. If anyone who made SDC was listening, this death would not have happened. The fatality was entirely avoidable, but the SDC people chose not to avoid it out of reckless enthusiasm about being first to market with new technology that hadn't been proven to be safe before it put innocent people in harm's way.
 
Enzo 3/21/2018 9:03 PM
There is no transportation technology that has never resulted in a death. As good as aviation is, it still occasionally kills someone. It is inevitable that someone will die as a result of this stuff, regardless of who is at fault. And when it happens, the conversation follows. Well, it happened.

I never ran full-on into a deer, but I have clipped them. I once saw a deer run right into the side of the vehicle in front of mine on the interstate. A friend of mine was riding down a country road when a fawn jumped out, and he wound up riding right THROUGH the little deer. Cut it in half.

[QUOTE]We've been talking about the SDC Can't See Black problem for a long time preceding this death. If anyone who made SDC was listening, this death would not have happened.[/QUOTE]
We haven't established that this is what happened here yet. And whether they have cured it or not, every problem we can think of here has already occurred to them. Just like all the kids who think they invented some new tube circuit.
 
bob p 3/21/2018 9:21 PM
Well, the way I see it there are only 2 options:

1) they never thought about Problem X (eg: the fact that the cars could only see 50 feet ahead of them, which is only 1/4 of the distance that a human can see), or
2) they did think about it but instead of stopping production they said, "fuck it."

Either way, a lady got killed due to their fuckup.
 
Enzo 3/21/2018 9:45 PM
Oh please, of course they thought of it. [B]You[/B] are assuming the car cannot see past 50 feet. We do not have evidence of that. What we have evidence of is that the car did not see, or saw and did not perceive, the woman in this case. You cannot extrapolate from there that the car has a 50-foot sight limit. And the "fuck it" response doesn't deserve a reply. We need to determine what the problem was here before we paint them with our broad brush. WHY did the car not see her? Maybe it only saw her shoes, and assumed they were squirrels or similar and not meriting a swerve. It is ludicrous that a car could not see past two car lengths. It is not ludicrous that a car might overlook something within that range, or misidentify something, or misinterpret something. But that is a different problem. There are plenty of possibilities beyond your stated two.
 
Chuck H 3/21/2018 10:18 PM
[QUOTE=Enzo;483835]... but the first fatality was inevitable, now it has happened, and so forces the conversation from "what if" to "it happened."[/QUOTE] collateral damage?
 
The Dude 3/21/2018 10:27 PM
Please take this in the spirit it is written. I AM NOT ABSOLVING THE SELF DRIVING CAR. It should have certainly at least attempted to stop and I'm not making light of the poor woman who was killed. Just making a sort of side observation. One should not attempt to cross the street when there is obvious oncoming traffic. It could have just as easily been a distracted human driver. You shouldn't count on a vehicle operated by either humans or robots to slow down for you.
 
Chuck H 3/21/2018 10:41 PM
[QUOTE=The Dude;483856]Please take this in the spirit it is written. I AM NOT ABSOLVING THE SELF DRIVING CAR. It should have certainly at least attempted to stop and I'm not making light of the poor woman who was killed. Just making a sort of side observation. One should not attempt to cross the street when there is obvious oncoming traffic. It could have just as easily been a distracted human driver. You shouldn't count on a vehicle operated by either humans or robots to slow down for you.[/QUOTE]

Also agree with this^^^^^^^^^^^^^. It depends on things we can't know about lighting conditions and camera specifics, unfortunately. It CERTAINLY appears that there was very poor visibility prior to an utterly unaware person stepping right the F#@k out in front of that car.

And I've actually had that happen to me. No visibility other than headlamps and then there she was. I was only going about 20 MPH and managed to just bump the girl. But I literally didn't see her until she was just in front of the bumper and at a jog. I have to wonder what the hell she was thinking to run out in front of a moving car like that. I slammed on the brakes and she was bumped by the car lurching forward on the tires' contact patch. It JUST knocked her over. Thank god she was unharmed. But I do know that stupid people can run out in front of moving cars, displaying either no sense whatsoever or suicidal tendencies. Both possibilities occurred to me.
 
bob p 3/22/2018 7:06 AM
[quote=Enzo]Oh please, of course they thought of it. You are assuming the car cannot see past 50 feet. We do not have evidence of that. What we have evidence of is that the car did not see, or saw and did not perceive, the woman in this case. You cannot extrapolate from there that the car has a 50-foot sight limit.[/quote]

You're proceeding with logic. That works great for tech-minded people, but the sad truth is that logic and technology don't fare well in the courts, which are administered by liberal arts majors who are so tech-averse that they have trouble inserting a battery with the right orientation. I'm not kidding about this. I know lawyers and judges who can't even run Windows or a word processor. There's just no way that they'll be able to handle a technologically intense case, but the courts don't select judges based on their technical skills. I've had to provide expert testimony at cases where the judge and jury discounted all of the "technical nonsense" and went with the rantings of the trial lawyer who was in his full-on hyperbole mode:

[quote=Trial Lawyer]You are saying that we have no evidence that the car cannot see past 50 feet? We do not have evidence of that? What about that dead lady lying in the gutter? There's no evidence that's more convincing than a dead body.[/quote]

The problem is that the courts don't even bother to go through the details on highly technical cases. No judge wants to deal with a stack of technical documents from the plaintiff that is three feet tall and reconcile it against the stack of technical documents from the defendant that is three feet four inches tall. The court relies on comparing the expert testimony of two different expert witnesses, one offered by each side, and then the case is decided based on which person someone chooses to believe. No juror wants to hear about the state of the CCD gain control. People don't give a damn about that. It's mumbo jumbo to them. All that they will care about is the body of the dead lady in the roadway. Jurors like to be the defenders of little people against the giants. A guy I know named Doug used to be like that too.

I'm actually surprised about the callous indifference to the fate of the dead pedestrian, treating her as if she's just the first statistic in a parade of dead-body statistics that are going to mount as we proceed in our quest for technology. Human life has more value than that.
 
bob p 3/22/2018 7:28 AM
[QUOTE=The Dude;483856]Please take this in the spirit it is written. I AM NOT ABSOLVING THE SELF DRIVING CAR. It should have certainly at least attempted to stop and I'm not making light of the poor woman who was killed. Just making a sort of side observation. One should not attempt to cross the street when there is obvious oncoming traffic. It could have just as easily been a distracted human driver. You shouldn't count on a vehicle operated by either humans or robots to slow down for you.[/QUOTE]

I'm with you on this. But only part way. Not everyone is competent in engaging the SDC in a battle of wits that proceeds to the death. I agree, people do need to take responsibility for their own safety. But I'd stop short of nominating this lady for a Darwin Award for crossing the street with her bike. Jaywalking is not a crime that is punishable by death. And pedestrians who jaywalk do so with the implicit assumption that if they miscalculate their chances of getting across the roadway, the other person involved in the situation will slow down. And if they falter in their attempt to cross the street, they have a reasonable expectation that the oncoming driver will attempt to slow down further, or to change lanes or hit the brakes if the situation should escalate into something life-threatening. Under the principle of "any common man's expectation" they have a reasonable expectation that the driver will act responsibly and in good faith, rather than take no action as they run them over, or heaven forbid, step on the gas in an effort to run them over like they're killing a squirrel. (Yes, some people really do that.)

There are all sorts of cases where failures can be expected to occur, where it's wrong to make the argument that you should not attempt to cross the street when there is obvious oncoming traffic. The world is full of dumb people. By definition 50% of the population is of below average intelligence. Then there are those who are really intellectually challenged. And children. Those people make mistakes. We'd all hope that they're not going to be allowed near a roadway unsupervised, but it happens. We all want to live in a society where a mistake doesn't end up costing someone their life, so I'll stop short of agreeing that this case can be discounted because people should not put themselves in danger. We have to accept the fact that the laws implicitly recognize that some people and animals are not capable of making all of the analyses that are necessary to come out on top of a mammal-car interaction, so the requirement of competence is placed on drivers through licensure and the expectation of reasonable behavior.

What I'm seeing so far looks like an SDC that couldn't pass a driver's test. If the SDC were a student driver who had a Drivers' Ed instructor in the passenger seat, the instructor would have been telling them not to exceed the speed limit. And not to speed in darkness. And in all likelihood, the instructor would have been so concerned about his safety while riding with an inexperienced teenager that he would have remained vigilant, and he would have hit the brakes as soon as he saw the pedestrian. But the SDC didn't do any of those things, either because it lacks vision or it lacks recognition. Right now we're allowing SDC on the road that can't meet the performance standards of a 16 year old novice driver. That's a mistake that needs to be corrected.
 
Enzo 3/22/2018 2:03 PM
Well, the way I see it there are only 2 options:

1) they never thought about Problem X (eg: the fact that the cars could only see 50 feet ahead of them, which is only 1/4 of the distance that a human can see), or
2) they did think about it but instead of stopping production they said, "fuck it."

bob, I was responding to your accusation that the car companies were either really ignorant or they knew a problem existed (as you defined it for them) and ignored it. You then took me to task over my response but shifted it to the context of a lawsuit.

Well that is just bob being slippery. I was answering his accusation, so he changes the topic. Whatever malfeasance you wish to project on the courts, jurors, etc., is not addressing YOUR complaint as to the motives of the car companies.
 
Chuck H 3/22/2018 2:25 PM
Quote Originally Posted by bob p View Post
...or heaven forbid, step on the gas in an effort to run them over like they're killing a squirrel. (Yes, some people really do that.)


A little sideways, but it'll add some much needed levity (I hope).

This reminded me of the time I was driving along with no cars ahead of me on an easy roadway. There was a car behind me though. I was doing the stupid move of checking messages on my phone while driving and veered onto the warning bumps. As I simultaneously corrected my course and looked up I saw I was too late to avoid running right over a dead cat on the side of the road. The first thing I thought was "How unpleasant. I just ran over a dead cat." But the next thing I thought about was that there was a car behind me! So they either thought "OMG that guy just went out of his way to run over a cat!" Or, even if they saw the cat was already dead, "OMG that guy just went out of his way to run over a dead cat!"
 
nickb 3/22/2018 2:39 PM
Seems like an IR camera might be a big help at night. At least it would allow warm bodied objects to be detected sooner.

That person did come up quickly. All over in one second. I wonder how fast the car was going? If I'm honest, I don't think I would have reacted nearly quickly enough to avoid a collision.

For the safety driver, I don't think it's realistic to expect a human to maintain the high level of attention required in order to be effective for more than twenty minutes or so.

Finally, I have to ask, why did the pedestrian cross the road into the path of an oncoming vehicle? I think there were failures all round here.
 
bob p 3/22/2018 4:09 PM
Quote Originally Posted by nickb View Post
Seems like an IR camera might be a big help at night. At least it would allow warm bodied objects to be detected sooner.
The Uber SDC reportedly uses a combination of Lidar and Radar. From the beginning I've been suggesting that SDC should be using thermal imaging. They don't because it's expensive. Hopefully this accident will change the perception that it's not cost effective.

Sticking to the facts, there is no evidence to support the speculation that any of the sensor devices sweep the gain of a CCD to improve dark recognition. If that technology were in use, it failed in this case. So far the only evidence that's available are the corpse of Elaine Herzberg and an SDC with an object recognition defect.

According to Dude's article linked in post # 141, the Tempe, AZ police released a statement that the Uber was exceeding the 35 mph posted speed limit, by going 38 mph. Suffice it to say that the SDC learned to drive by mimicking a human driver that exceeds the speed limit by a little but not much. That alone is significant, as the programming of the SDC allows it to speed.

We all know that the SDC shouldn't have been speeding in the first place, and excessive speed contributes to the severity of accidents. We know that the Uber logged itself as speeding, so I'm not clear why a citation wasn't issued.

If we pause for a moment to consider that kinetic energy increases as the square of velocity, what appears to be a minimal increase in speed may become significant: a 3 mph increase in speed from 35 mph to 38 mph results in a kinetic energy figure that's 118% of what it should have been. Excessive speed makes braking that much more difficult, but that's a bit of a moot point because the SDC never tried to hit the brakes. But the SDC did hit Elaine Herzberg with 118% of the kinetic energy that would have been allowed by law. It's hard to know if that would have made a difference with respect to Elaine Herzberg's survival, but chances are that she would have done better if the SDC had applied the brakes for 50 feet rather than not applying them at all.

I looked for braking statistics for the SDC that killed Elaine Herzberg, a Volvo XC90. The stats are very hard to find. Although manufacturers like to publish their 0-30mph and 0-60 mph figures to impress performance-conscious customers, they don't like to publish braking data. The only data that I could find came from Car and Driver, which provided a 70-0 statistic of 167 feet. I had really hoped to find a 30-0 statistic or a 40-0 statistic. No such luck.

That person did come up quickly. All over in one second. I wonder how fast the car was going? If I'm honest, I don't think I would have reacted nearly quickly enough to avoid a collision.
Our best-guess estimate is that Elaine Herzberg didn't appear clearly in the video until she was within perhaps 50 feet of the car. The SDC didn't attempt to brake, and 50 feet probably isn't enough distance to stop a vehicle exceeding the speed limit. But the argument that 50 feet would not have been enough to stop the vehicle is the wrong argument to make.

50 feet of braking from 35 mph may have indeed stopped the vehicle. I performed a road test today. My daily driver can stop from 35 mph within 50 feet. It's got 4-wheel disc brakes and fat tires, much like the Volvo. Even if the Volvo couldn't quite stop in 50 feet, braking would have slowed the vehicle to the point that it may not have killed Elaine Herzberg when it hit her. At 38 mph a pedestrian is likely to be killed. At 10-15 mph, maybe not. So I think that if people are focusing on the rapid appearance of the lady, 50 feet in front of the car, they're drawing the wrong conclusion. The problem is that the SDC failed to recognize the pedestrian and made no attempt to brake. If there had been an attempt to brake, chances are the outcome would have been very different.
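For anyone who wants to sanity-check that 50-foot figure, here's a quick back-of-the-envelope sketch. The deceleration values are my assumptions for a modern SUV with good tires on dry pavement, not measured XC90 data:

[CODE]
# Rough stopping-distance sketch: d = v^2 / (2 * a).
# The 0.7 g and 0.8 g decelerations are assumptions for a modern SUV on dry
# pavement, not measured data for the Volvo XC90.

G_FTPS2 = 32.174            # standard gravity, ft/s^2
MPH_TO_FTPS = 5280 / 3600   # 1 mph = 1.4667 ft/s

def braking_distance_ft(speed_mph, decel_g):
    """Distance needed to stop from speed_mph at a constant decel_g * g."""
    v = speed_mph * MPH_TO_FTPS
    return v * v / (2 * decel_g * G_FTPS2)

for mph in (35, 38, 45):
    for g_frac in (0.7, 0.8):
        print(f"{mph} mph at {g_frac} g: {braking_distance_ft(mph, g_frac):.0f} ft")
[/CODE]

At 0.8 g the stop from 35 mph comes out right around 51 feet, which squares with my road test. Note that's braking distance only -- no perception or reaction delay is included.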

The perception that the lady came up so fast that there wasn't time to react is an artifact of the sensor failing to recognize her. If you or I were driving down the road we would have seen her crossing the street on our left from 200 feet away and we would have had time to slow down. Any of us would have seen a person directly in front of us in the path of our headlights 100 feet away, which would have given the Uber 2 seconds to respond. That's plenty of time.

After looking at that video several times, I'm wondering if the headlights were angled too far down, in addition to there being a dark recognition problem. When I drive my car, I can see a lot farther ahead than I can see in that video.

For the safety driver, I don't think it's realistic to expect a human to maintain the high level of attention required in order to be effective for more than twenty minutes or so.
I could design a circuit that would give intermittent electrical shocks to keep them alert.

I saw an episode of Shark Tank where a group of pre-teen kids developed a steering wheel that sensed whether a driver's hands left the 10-and-2 position and sounded an alarm. Their intent was to prevent people from texting while driving.

Finally, I have to ask, why did the pedestrian cross the road into the path of an oncoming vehicle? I think there were failures all round here.
I'm not sure that's a question worth answering. The NYT aerial photograph shows a rightward curve in the road that isn't evident in the video. That curve would have placed a pedestrian on the left in a more central view for the SDC, and may have made the SDC harder for the pedestrian to see. Sure, there were failures all around, but I find it hard to place blame on the dead lady. There's no way to prevent the next accident by doing that.
 
bob p 3/22/2018 4:23 PM
Quote Originally Posted by Chuck H View Post


A little sideways, but it'll add some much needed levity (I hope).

This reminded me of the time I was driving along with no cars ahead of me on an easy roadway. There was a car behind me though. I was doing the stupid move of checking messages on my phone while driving and veered onto the warning bumps. As I simultaneously corrected my course and looked up I saw I was too late to avoid running right over a dead cat on the side of the road. The first thing I thought was "How unpleasant. I just ran over a dead cat." But the next thing I thought about was that there was a car behind me! So they either thought "OMG that guy just went out of his way to run over a cat!" Or, even if they saw the cat was already dead, "OMG that guy just went out of his way to run over a dead cat!"
How long had the dead cat been there? Long enough to become bloated? After running over it you'd be lucky if your car wasn't covered in ... wait for it ... it's coming ... pusy cat.


 
nickb 3/22/2018 4:23 PM
Quote Originally Posted by bob p View Post
...When I drive my car, I can see a lot farther ahead than I can see in that video....
The video may be misleading. The human eye has a logarithmic response to light and a huge dynamic range of around 2^20. The video is likely 8 bits, or a dynamic range of 2^8. This does mean that a person will see further than the SDC's visible system and would likely have seen the pedestrian sooner.
 
bob p 3/22/2018 4:28 PM
Now this is interesting -- the New York Times just raised the speed limit on the road to 45 mph, which conflicts with the Tempe PD statement that the posted speed limit was 35 mph.

Dude's article in Fortune, with the 35 mph speed limit:
Uber May Not Be to Blame for Self-Driving Car Death in Arizona | Fortune

New NYT article, with a 45 mph speed limit:
http://www.nytimes.com/interactive/2...an-killed.html

The NYT article also includes an artist's re-creation of the accident scene, which is also in conflict with the video. According to the NYT article the SDC hit Elaine Herzberg in an area where the one-way roadway was 4 lanes wide and widening to 5 lanes, while the video shows that the one-way roadway was only 2 lanes wide where the SDC hit her. Some of the NYT's "facts" don't jibe.

[IMG]http://static01.nyt.com/newsgraphics/2018/03/20/self-driving-uber-death/7ed17129da41763ed1c6f0bf194fa32d10bda7dc/accident-diagram-1050.png[/IMG]
 
The Dude 3/22/2018 4:28 PM
Keep in mind that the video is likely only one set of data. There may be other input sources to the SDC we don't yet know about. There could be lasers, night vision cameras, radar, etc. We don't know the full story. A full investigation will likely lead to better/more conclusive results than our speculation.
 
bob p 3/22/2018 4:54 PM
The NYT article in the post preceding your last one answers some of your questions about what types of sensors are in the Uber SDC.
 
bob p 3/22/2018 4:58 PM
Quote Originally Posted by nickb View Post
The video may be misleading. The human eye has a logarithmic response to light and and a huge dynamic range of around 2^20. The video is likely 8 bits or a dynamic range of 2^8. This does mean that a person will see further than the SDR's visible system and would likely have seen the pedestrian sooner.
2^20 / 2^8 = 2^12. That's a difference of what, 4096 times better?
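Putting those bit depths into more familiar units (the 2^20 figure is nickb's estimate for the eye, not a measured spec):

[CODE]
import math

# Express a linear dynamic-range ratio in dB and photographic "stops".
# The 2**20 (eye) and 2**8 (video) ratios are the estimates from this thread,
# not measured specifications.

def describe(name, ratio):
    db = 20 * math.log10(ratio)   # the way sensor makers usually quote DR
    stops = math.log2(ratio)      # one stop = one doubling of light
    print(f"{name}: {ratio:,}:1 -> {db:.0f} dB, {stops:.0f} stops")

describe("human eye (est.)", 2 ** 20)   # ~120 dB, 20 stops
describe("8-bit video", 2 ** 8)         # ~48 dB, 8 stops
describe("gap", 2 ** 12)                # ~72 dB, 12 stops
[/CODE]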

We restrict people's ability to drive a car if their vision is worse than 20/60.
 
Chuck H 3/22/2018 6:21 PM
Very good point about the difference between the human eye and the camera/sensor eye. Looking at the video (which I actually won't do again) the depth of visibility looks so poor that "I" never would have been driving that speed. Point is "I" probably would have been able to see. If the sensors are at all compromised like the camera there is a serious problem. The SDC has sensory information and speed limit information to work with, but NO intuition.
 
bob p 3/22/2018 10:23 PM
Intuition. Judgement. That's an interesting point. How do you teach intuition to a machine? You can't. What happens with machine learning is that the machine mimics what a person does under similar conditions. Normally a human can see pretty far, even at night, but an SDC can't. But the SDC isn't aware that the human can see farther than it does. The SDC just sees what it sees, and relies on the person to teach it that traveling at that speed is OK. And the SDC takes the human driver's approval of the speed as confirmation that it's correct, without realizing that the human is seeing farther. There's never a direct comparison between the distance that the human sees and the distance that the SDC can see.

Although the SDC's vision is limited to a shorter distance, the SDC thinks this is entirely normal and doesn't know better. There's no feedback that the speed is too fast. The SDC thinks what it sees is normal, though a person with such limited distance vision would be uncomfortable. The SDC proceeds to drive itself as if nothing is wrong, without being able to realize that it's overdriving its headlights.
 
Enzo 3/22/2018 10:53 PM
After looking at that video several times, I'm wondering if the headlights were angled too far down, in addition to there being a dark recognition problem.
How many times do we have to tell you the car does not use the dash cam as its control input?

So far the only evidence that's available are the corpse of Elaine Herzberg and an SDC with an object recognition defect.
Only evidence... TO YOU. Because YOU are not aware of something doesn't mean it isn't there.

The Uber SDC reportedly uses a combination of Lidar and Radar
Yes, it does, and it also uses cameras.

The cameras:
Use parallax from multiple images to find the distance to various objects. Cameras also detect traffic lights and signs, and help recognize moving objects like pedestrians and bicyclists.
Neither lidar nor radar will detect the color of a traffic light. Nor will thermal imaging. Cameras will. And they are an integral part of the kit. Various reports discuss the car using lidar and radar, and they are accurate, but not complete. The lidar is already infrared - thermal range - the car does not drive around sweeping a laser light show at passersby.
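As a side note, the parallax idea boils down to one line of math: depth = focal length x baseline / disparity. A toy sketch with made-up camera numbers (nothing here comes from Uber's actual rig):

[CODE]
# Toy stereo-depth sketch: depth = focal_length * baseline / disparity.
# The camera numbers are made up for illustration, not from any Uber sensor.

FOCAL_PX = 1400.0    # focal length in pixels (assumed)
BASELINE_M = 0.30    # spacing between the two cameras, metres (assumed)

def depth_from_disparity(disparity_px):
    """Distance to a feature that appears disparity_px apart in the two images."""
    if disparity_px <= 0:
        raise ValueError("zero disparity means the feature is effectively at infinity")
    return FOCAL_PX * BASELINE_M / disparity_px

for d_px in (40.0, 10.0, 4.0):
    print(f"disparity {d_px:>4.1f} px -> roughly {depth_from_disparity(d_px):.0f} m away")
[/CODE]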

If that technology were in use, it failed in this case. So far the only evidence that's available are the corpse of Elaine Herzberg and an SDC with an object recognition defect.
Yes, that is the evidence WE have - you and I - but that is not the evidence the researchers are limited to. That is the one thing we know, the system failed to see the woman. All your other extrapolations are just idle thought.
 
bob p 3/23/2018 9:29 AM
Quote Originally Posted by Enzo View Post
How many times do we have to tell you the car does not use the dash cam as its control input?
Not once. But for some reason you keep repeating yourself.

Only evidence... TO YOU. Because YOU are not aware of something doesn't mean it isn't there.
A dead body is a dead body. Yes, that's evidence to me. If it's not evidence to someone else then there is a disconnect somewhere.

Neither lidar nor radar will detect the color of a traffic light. Nor will thermal imaging. Cameras will. And they are an integral part of the kit. Various reports discuss the car using lidar and radar, and they are accurate, but not complete. The lidar is already infrared - thermal range - the car does not drive around sweeping a laser light show at passersby.
Sure, the fact that the lasers are in the IR band prevents them from being a distraction to passersby. But there's a limit to how much IR laser radiation devices are allowed to emit, because lasers damage the retina. People who work in the presence of lasers are required to wear protective eyewear. So why is it that lidar is allowed to be operated on the street? Because the laser intensity is lower than what regulators have determined to be the safe threshold for exposure. If the laser output were higher, everyone in the population would be required to wear laser-protection goggles. That limits the output of the lidar. I wonder what effect limiting the emitter's output has on sensor input. (Not really.)

All your other extrapolations are just idle thought.
And the sky is blue. So what's your point? Idle thought is what threads in The Lobby are all about.
 
Enzo 3/23/2018 12:33 PM
Because idle thought doesn't make a case. It appeared that you were trying to make a case for your positions. If all you were doing was spreading idle thoughts, well, feel free to scatter words at random. We had assumed you were making a point.

YOU are the one complaining the headlights were aimed too far down, YOU complained the images were dark. That is only the case in the dash cam image.

Don't be coy. A dead body is evidence of the event, but it is NOT evidence of WHY the event happened.

Earlier you claimed the car had lidar and radar but also complained they needed to use thermal imaging. You added a comment they did it for money reasons. The lidar already works in the thermal range. The lidar is not a threat to human vision, there is a wide gulf between damaging levels and lack of range.
 
bob p 3/23/2018 1:23 PM
You have a fundamental misunderstanding of where I'm going. I'm planting seeds for thought, I'm not interested in your debate. Some minds are receptive to seeds for thought, while other minds are barren ground. I'm not aiming for the barren ground. Lidar and radar are active emitting technologies. Night vision and thermal imaging are passive.
 
Steve A. 3/23/2018 9:46 PM
So was the speed limit 35 mph or 45 mph? Police said that it was 35 mph but a Google Maps photo taken last year shows a 45 mph sign south of the accident location. Which brings up the question of how autonomous vehicles read speed limit signs? Perhaps the speed limits are entered into a database but limits do change and I wonder how often the database is updated.

The crash occurred near Mill Avenue and Curry Road late on Sunday in Tempe, Arizona. The Uber vehicle was headed northbound when a woman, identified as 49-year-old Elaine Herzberg, was struck while pushing a bicycle across the street. Herzberg was taken to the hospital, where she later died from her injuries.

The vehicle was traveling 38 mph, though it is unclear whether that was above or below the speed limit. (Police said the speed limit was 35 mph, but a Google Street View shot of the roadway taken last July shows a speed limit of 45 mph along that stretch of road.) The driver, 44-year-old Rafaela Vasquez, has given a statement to police.

Police have viewed footage from two of the vehicle’s cameras, one facing forward toward the street, and the other inside the car facing the driver. Based on the footage, Moir said that the driver had little time to react. “The driver said it was like a flash, the person walked out in front of them,” she said. “His first alert to the collision was the sound of the collision.”

https://www.theverge.com/2018/3/20/1...h-fault-police
[img]http://music-electronics-forum.com/attachment.php?attachmentid=47782&d=1521862110[/img]

With a speed limit of 45 mph I imagine that a lot of cars would be going at least 50 mph if not faster. That was no small country road, as I had thought while watching the video, but a major roadway. While the victim was 60 feet from the sidewalk on the left, she popped out of a median strip with tall vegetation only one lane away from where she was hit, which is why she was not detected earlier. Not to blame the victim, but I think she would have been hit even if a carbon-based lifeform had been piloting the vehicle.


Steve A.

P.S. Autonomous vehicles are definitely in our future but IMO most of the bugs still need to be worked out. I have a hunch that we will first see convoys of maybe four self-driving 18-wheelers on the major freeways following a lead truck driven by a real person. That should work fairly well unless some damned human gets antsy and decides to cut in front of one of the self-driving trucks. Once off the freeway there would be parking lots for the autonomous trucks waiting for a human driver to deliver them to their final destination and back to the parking lot.

EDIT... I was just now wondering how loud the Volvo was. The victim might not have heard the Ubermobile coming if it was drowned out by loud vehicles in the southbound lanes.


 
Enzo 3/23/2018 10:31 PM
I think to obsess over the speed limit is to fall into one of bob's traps. It isn't the problem.

The car hit the woman, and we need to know why. To say the difference between a 35 mph limit and a 38 mph car is all it takes is, I think, a side trip. And if the limit was 45 there and the car was only doing 38, then clearly the speed was not the issue. At least not directly.

The vehicles read the signs optically with their cameras. That is how they tell a stop sign from a no-left-turn sign from a speed limit 45 sign.

We will never know what was going on in the woman's mind. Why wasn't she riding? Why was she crossing in the middle of nothing? In the dark?

The car system is designed to detect and react to pedestrians, cars, blockages in the road. This is not something they forgot or didn't care about. There are online tutorials if you care. The car sees them and puts them in reference frames. The lidar is 360 degree coverage.

We don't know why the car didn't see the woman; there are multiple systems to detect and react to her. We will find out what happened eventually.
 
Justin Thomas 3/23/2018 11:57 PM
Quote Originally Posted by bob p View Post
It completely dismisses the notion that the pedestrian "appeared out of nowhere." The pedestrian was in the middle of the road, crossing at a slow pace. She did not leap in front of the car. The problem is that the car's camera system failed to resolve the person in darkness. The first thing that I was able to see on the video was the victim's white shoes. Her body wasn't discernible because she was wearing a black top and because ... wait for it ... you know it's coming ... SDC can't see black.

ABC's machine vision video has the exact same problem that my home surveillance camera has -- The SDC can't recognize a person until they're right up on top of them, or in the case of my stationary camera, it can't recognize a person until the person is right up on top of the camera.
What I saw in that video was an instance of what I learned in Driver's Ed 21 years ago is called "overdriving your headlights." The Robo-car suffers from the same problem many human drivers suffer from when driving at night: the vehicle covers distance faster than the brain can process the thousands of variables needed for driving. Headlights, no matter how bright, are no substitute for bright and (practically) limitless daylight. That, COMBINED with the inadequate sensitivity of the self-driving system, makes for a steamy pile waiting to happen. That "safety driver" should have slowed the car down to ten mph.

And I agree - the safety driver seems to be paying intermittent attention at best... he's doing the same thing I was doing the last time I rear-ended someone at ten mph....

Justin
 
Enzo 3/24/2018 1:03 AM
The car was overdriving the dash cam vision. The car's driving vision system uses completely separate imaging. We don't know what it was seeing.
 
Steve A. 3/24/2018 1:10 AM
Quote Originally Posted by Enzo View Post
I think to obsess over the speed limit is to fall into one of bob's traps. It isn't the problem.
Enzo, I was wondering why the early reports mentioned a 35 mph speed limit yet the NYT and others said later that it was 45 mph. With it evidently being 45 mph based on the photo from Google Maps I would have been extremely careful in trying to cross the northbound lanes.

I wanted to include the following picture in my previous post because it supports my guess that the victim literally appeared out of nowhere popping out from the vegetation in the median strip.


[img]http://music-electronics-forum.com/attachment.php?attachmentid=47783&amp;d=1521874669[/img]


We will never know what was going on in the woman's mind. Why wasn't she riding? Why was she crossing in the middle of nothing? In the dark?
Early reports suggested that she might have been homeless which of course immediately makes us question her sobriety or mental stability, unfortunately. I suggested that she might not have heard the Uber vehicle coming if there was noisy traffic in the southbound lanes.

If I was crossing the northbound lanes from the median strip to the sidewalk I would push the bicycle across, running to dodge vehicles if necessary, rather than try to mount the bicycle after pushing it to the street which would make me very vulnerable for a few seconds.

We don't know why the car didn't see the woman, there are multiple systems to detect and react to her. We will find out what happened eventually.
If the woman was hidden behind vegetation in the median strip the vehicle would have needed infrared sensors to know that she was there.

I like to pretend that I can "channel" people I read about... like I can see into their mind and soul. It's a stupid game I play, betting anybody a nickel (my limit) that I am right. So that is my bet on what happened that night and if I'm wrong I owe you a nickel.

Other than that I have absolutely no vested interest in how the story turns out. I would like to know more about the victim if the people who knew her care to share that with the public. I am sad whenever anyone dies needlessly.

Steve A.

P.S. Here is the address where the accident happened. Perhaps someone can see what Google Earth shows (I was wondering if there was a big bend in the road south of that location):

640 N Mill Ave, Tempe, AZ


 
Enzo 3/24/2018 1:51 AM
Like any other story, the facts come in piecemeal, and individuals assume things that are not fact. So someone tells the reporter the speed limit is one thing, and it turns out to be another. Nothing cosmic or underhanded, it happens all the time. They didn't lie, they were simply mistaken. Maybe the speed limit was 35 on the other side of the bridge, and rises to 45 just ahead of the accident. I have no idea.

I remember a story a while back where a guy was holed up in a house, the police and SWAT had the house surrounded. Ongoing coverage on the news. The guy refused to communicate, wouldn't answer the phone, wouldn't come to the door or leave the house. They tried waiting him out, but after hours of this, they finally raided the house. No one was in it. All the other reports had been things people assumed, but were not facts.

Just my opinion, but if she had been riding the bike, she likely would have just ridden to the corner and taken the cross street, rather than hop off and walk the bike across traffic in mid-block. So why was she walking?
 
Justin Thomas 3/24/2018 8:19 AM
Quote Originally Posted by bob p View Post
What I'm seeing so far looks like an SDC couldn't pass a drivers' test. If the SDC were a student driver who had a Drivers' Ed instructor in the passenger seat, the instructor would have been telling them not to exceed the speed limit. And not to speed in darkness. And in all likelihood, the instructor would have been so concerned about his safety while riding with an inexperienced teenager that he would have remained vigilant, and he would have hit the brakes as soon as he saw the pedestrian. But the SDC didn't do any of those things, either because it lacks vision or it lacks recognition. Right now we're allowing SDC on the road that can't meet the performance standards of a 16 year old novice driver. That's a mistake that needs to be corrected.
But there WAS a "Driver's Ed Instructor" in the car. It just doesn't look like he was paying attention. Maybe I should be grateful that I learned to drive in a time before smartphones and gadgets and navigation systems and backup cameras...

Justin
 
Justin Thomas 3/24/2018 8:23 AM
Either way, as the safety driver in that car, I could have decided that the car was still overdriving its headlights and needed to slow down. I'm not so cavalier as to trust my life at that level to a robot yet.

Justin
 
bob p 3/24/2018 12:33 PM
I think to obsess over the speed limit is to fall into one of bob's traps.
Another ad hominem attack, just like the others.
 
bob p 3/24/2018 12:39 PM
Quote Originally Posted by Steve A. View Post
So was the speed limit 35 mph or 45 mph? Police said that it was 35 mph but a Google Maps photo taken last year shows a 45 mph sign south of the accident location. Which brings up the question of how autonomous vehicles read speed limit signs? Perhaps the speed limits are entered into a database but limits do change and I wonder how often the database is updated.



[img]http://music-electronics-forum.com/attachment.php?attachmentid=47782&d=1521862110[/img]
Interesting that you found a speed limit sign using google maps. Great find.

Steve's photo shows a limited access section of highway that is not open to pedestrian cross-traffic. Although we all know that this was not the precise section of roadway involved in the accident, it still gives us some helpful information. I'll assume that Steve was conscientious enough to find the speed limit sign on a section of roadway that preceded the accident zone, which would mean that cars on this limited access section of roadway would have a speed limit of 45 while moving toward the site of the accident.

The police told us that the accident did not occur on a section of roadway that was classified as a limited access highway. The site where the accident occurred actually had a pedestrian crosswalk located 60 yards away.

This is relevant because the law requires different speed limits to be applied in areas that are classified as crosswalks. In an area near a crosswalk the speed limit would be 35 mph, as mentioned by the Tempe PD police chief.

While the segment of roadway in this pic is a non-crosswalk area that has a 45 mph speed limit, I think it's reasonable to trust the Tempe PD regarding the speed limits that were actually in effect at the accident site. Chances are that the police who are responsible for writing up the actual accident report did a good job of fact checking on the posted speed limits.
 
bob p 3/24/2018 1:29 PM
I think to obsess over the speed limit is to fall into one of bob's traps. It isn't the problem.
Speed is a highly relevant factor in both the cause of the accident and its outcome. Everyone who has passed a drivers' test knows that speed is one of the most relevant factors in predicting both the likelihood and the seriousness of an accident. Higher speeds leave the driver less time to react. They also put more energy into a collision, and energy transfer is what causes injuries. To say that obsessing over speed is to fall into anyone's trap is absurd.

Let’s consider how speed plays into this accident:

INCREASED SPEED CAUSES DECREASED REACTION TIME.

The SDC failed to react to a victim who allegedly "appeared out of nowhere" or "jumped out of the bushes." The facts are that Elaine Herzberg did neither. She was pushing her bike across the roadway at a constant speed.

A) Speeding. The SDC was exceeding the speed limit. That is an infraction that warrants issuing a citation to the "safety driver." It is a contributing factor to the accident. Check the box on the accident report form.

B) Driver Inattention. Looking at the video, the safety driver’s inattention was so bad that he responded only to the impact. Driver Inattention is a contributing factor to the accident. Check the box on the accident report form.

C) Decreased reaction time. Speeding and driver inattention combine to produce decreased reaction time. In this case the SDC did not react to the pedestrian at all while it was speeding at 38 mph. If the SDC had been traveling at a lower rate of speed it would have had more time to collect more sample images and more time to process them before striking the victim. The opportunity to gather more imagery and more time to process that data might have resulted in a different outcome. Excessive speed took that time away.
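To put some rough numbers on that point: the 1.5 second reaction time below is a commonly cited ballpark for an alert human driver, my assumption rather than anything from the accident report:

[CODE]
# Distance covered before any braking starts, for a given reaction time.
# The 1.5 s reaction time is an assumed ballpark for an alert human driver,
# not a figure from the accident investigation.

MPH_TO_FTPS = 5280 / 3600   # 1 mph = 1.4667 ft/s

def reaction_distance_ft(speed_mph, reaction_s=1.5):
    return speed_mph * MPH_TO_FTPS * reaction_s

for mph in (35, 38):
    print(f"{mph} mph = {mph * MPH_TO_FTPS:.1f} ft/s, "
          f"{reaction_distance_ft(mph):.0f} ft covered in 1.5 s")
[/CODE]

At 38 mph the car covers roughly 84 feet before a typical human even touches the pedal -- more than the 50 feet of clear visibility being debated above, which is exactly why every extra mile per hour eats into the margin.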
 
bob p 3/24/2018 1:39 PM
More on how speed is an important factor in this accident:

INCREASED SPEED CAUSES INCREASED ENERGY TRANSFER FROM THE VEHICLE TO ITS VICTIM.

Speed limits matter. Governments use speed limits to define the maximum safe speed that can be traveled on a roadway. Speed limits are lowered in areas occupied by pedestrians because the outcome of a vehicle-pedestrian collision is largely dictated by vehicular speed. I'm sure someone will whine that I'm leading us into another "bob trap." I prefer to think of this as an irrefutable "math trap" that's defined by the laws of physics.

Physics dictates that the kinetic energy of the vehicle is directly proportional to the SQUARE of its velocity.

KE = 1/2 MV^2; where M is the mass of the vehicle and V is its velocity

Speed matters very much in this case, because vehicular speed largely defines the likelihood of death in vehicle-pedestrian collisions, and because the Uber made no attempt to brake. Calculating its kinetic energy at the moment that it struck its victim is how you define how hard it struck her. As we all know, the harder you get struck by a car, the more likely you are to be killed by it. We all know that an SUV hitting you at 45 mph on a highway is going to hurt a lot more than one that hits you in a parking lot going 10 mph. The numbers are not irrelevant; they precisely quantify how hard the vehicle hits its victim, and how hard the victim is hit closely corresponds to the likelihood of death.

Simple algebra tells us that the kinetic energy of the vehicle going 35 is going to be a lot less than the vehicle going 45:

According to Volvo, the dry weight for the lightest XC90 is 4,327 lb. That dry weight doesn't include the fluids that are required for the car to be driven, the driver, or the Uber gear. If we assume conservatively that the car carried 30 lb of fluids, a 200 lb driver and 100 lb of Uber gear, that gives us an approximate weight of 4,657 lb.

Before anyone squawks that using an approximated “wet weight” of the vehicle in the following calculations renders them invalid, study your algebra. When we compare the ratio of kinetic energy at different speeds, the vehicle's mass will be factored out of the results as the resulting ratio will be defined only by the square of velocity. To avoid clutter, I’m going to leave the units out of the calculation because when we take the ratio of kinetic energy at different speeds to compare them, the units also cancel and the resulting ratio is a dimensionless number.

KE = 0.5 * M * V^2
KE(35) = 0.5 * 4657 * 35 * 35 = 2,852,412
KE(45) = 0.5 * 4657 * 45 * 45 = 4,715,212

KE(45)/KE(35) = 1.65

The "bob trap" shows us that a vehicle traveling 45 mph has 65% more kinetic energy when it strikes a pedestrian than one traveling 35 mph. This is the reason that posted speed limits on divided highways require vehicles to reduce their speed from 45 to 35 when there is a pedestrian crossing. As noted by the police, there *WAS* a pedestrian crossing near the site of the accident. I think it was 60 yards away.

Now consider the kinetic energy of the Uber exceeding the speed limit at 38 mph. Using the same formula:

KE(38) = 3,362,354

Now consider what might have happened if the Uber going 38 mph had recognized an obstacle of any kind in its path and applied its brakes in an unsuccessful effort to stop. I think it's reasonable to expect that the Volvo, which has disc brakes, could have reduced its speed by one-half before striking Elaine Herzberg. My BMW SUV has a similar weight and also has disc brakes, and I have verified through testing that I can make a panic stop in 50 feet from 35 mph, so the following calculation using a speed of 19 mph seems generous:

KE(19) = 0.5 * 4657 * 19 * 19 = 840,588

KE(38)/KE(19) = 4.0

That’s right, the amount of KE at 38 mph is FOUR TIMES AS MUCH as it is at 19 mph. No surprise, we could have easily predicted that, knowing that 2^2 = 4.

This is likely the most serious calculation that is involved in the "bob trap": failure to slow the vehicle allowed it to hit Elaine Herzberg with FOUR TIMES AS MUCH ENERGY as it would have had if it had only made a reasonable attempt to stop.

If the SDC had been successful in stopping it would have transferred INFINITELY LESS ENERGY to Elaine Herzberg:

KE(38) = 3,362,354
KE(0) = 0

KE(38)/KE(0) = undefined

Speed matters. Don’t close your eyes and pretend it doesn’t.
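For anyone who wants to rerun the arithmetic, here it is in a few lines. The 4,657 lb wet weight is the estimate described above; since we only compare ratios, the mass and the mixed units cancel out anyway:

[CODE]
# Kinetic energy comparison, KE = 0.5 * m * v^2.
# The 4657 lb "wet weight" is the estimate from the post above; because only
# ratios are compared, the mass (and the mixed units) cancel out.

MASS_LB = 4657

def ke(speed_mph):
    return 0.5 * MASS_LB * speed_mph ** 2

for mph in (19, 35, 38, 45):
    print(f"KE({mph}) = {ke(mph):,.0f}")

print(f"KE(45)/KE(35) = {ke(45) / ke(35):.2f}")   # ~1.65
print(f"KE(38)/KE(35) = {ke(38) / ke(35):.2f}")   # ~1.18
print(f"KE(38)/KE(19) = {ke(38) / ke(19):.1f}")   # 4.0
[/CODE]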
 
nosaj 3/24/2018 2:10 PM
bobp are you sure you don't need a job up on the Hill as say a lobbyist? I think you could.

nosaj
 
bob p 3/24/2018 2:15 PM
Lidar can't see black
Quote Originally Posted by nickb View Post
Seems like an IR camera might be a big help at night. At least it would allow warm bodied objects to be detected sooner.
IR/Thermal imaging is the gold standard for identifying bodies in the darkness. The US Military uses thermal imaging to identify targets. Thermal imaging is expensive.

Part of the problem with Lidar is that people assume it's wonderful while turning a blind eye to its weaknesses. Lidar imaging is emissive technology. Thermal/IR imaging is passive technology. Passive technologies such as thermal rely upon collecting detectable emissions that are generated by the target. Emissive technologies like Lidar rely upon transmitting a signal and interpreting its reflection or lack thereof.

Emissive technologies emit a signal and make a calculation based upon how long it takes for the signal to be returned. In the event that the signal is returned, that leads to one conclusion. In the event that a signal is not returned, that leads to a different conclusion. The time-to-return is used to create a map. Here's an example of a LIDAR map of a roadway at night.

[img]http://upload.wikimedia.org/wikipedia/commons/a/aa/Road_Map_using_LIDAR.jpg[/img]

As you can see, areas that are TOTALLY BLACK are areas in which no reflected signal was received. During night hours, those no-reflection areas are interpreted as space rather than as objects. A pitch-black / no-reflection area is mapped as dark sky.*


What happens if a pedestrian wears a non-reflective black garment that prevents reflection of the emanated signal? Null data is interpreted as being space, rather than as being an object.
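As a toy illustration of that failure mode -- this is nobody's production perception code, just a sketch of the naive "no echo means nothing there" assumption:

[CODE]
# Toy sketch of the "no return = free space" trap in naive lidar processing.
# Real perception stacks are far more sophisticated; this only illustrates the
# failure mode under discussion.

MAX_RANGE_M = 120.0   # assumed maximum sensor range

def classify_return(range_m):
    """Classify a single lidar beam the naive way."""
    if range_m is None or range_m > MAX_RANGE_M:
        # No usable echo: could be open sky... or a matte black object that
        # swallowed the pulse. The naive interpretation can't tell the difference.
        return "free space"
    return f"obstacle at {range_m:.1f} m"

print(classify_return(None))    # non-reflective black garment -> "free space"
print(classify_return(42.0))    # normal reflective target -> "obstacle at 42.0 m"
[/CODE]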

We're right back to where we started: Self-Driving Cars Can't See Black.

There's yet another problem -- look at that "artifact" in the oncoming lane of traffic. What is it? Lidar does a very poor job of identifying what it is.


* (This night-time false-negative error is analogous to the daytime false-negative error where the Tesla failed to discriminate a white semi-trailer from a brightly lit sky background. In this case the colors are reversed. In the white/dark cases the sensors involved may actually be different, though the resulting failure mode has similarities.)
 
bob p 3/24/2018 2:16 PM
Quote Originally Posted by nosaj View Post
bobp are you sure you don't need a job up on the Hill as say a lobbyist? I think you could.

nosaj
Hah! That's a great idea, but I don't think I could get a job with Uber and the other side isn't paying.
 
nosaj 3/24/2018 2:18 PM
Quote Originally Posted by bob p View Post
Hah! That's a great idea, but I don't think I could get a job with Uber.
just referring to the arguing of the various topics you cover. Maybe arguing isn't the right word so much as debate.

nosaj
 
Enzo 3/24/2018 10:03 PM
I apologize, I will try to make it simpler for bob.

Yes, speed is an important factor in car crashes and related events.

However, in the context of this discussion we were talking about WHY this woman was hit by the car. Obviously the machine vision/detection system failed to sense her presence. Judging from what we saw, I think it quite fair to say they would have hit her if they were doing 35 or if they were doing 38. She wasn't hit because of the few miles per hour in dispute, she was hit because of the major failure of the system to detect her in the first place.

Therefore: to obsess over 35 versus 38 mph is to ignore the real problem.

No one said speed didn't matter, we said it isn't the salient issue in THIS death. Yes, if they were driving 2 mph, it wouldn't have killed her, but that is not within a reasonable range of speed to expect the car to travel.
 
Steve A. 3/26/2018 5:15 PM
Mystery solved... well, at least part of it.
I finally got around to looking at the Google Map satellite image of 640 N Mill Ave, Tempe, AZ and it answers several questions posted in this thread.

[img]http://music-electronics-forum.com/attachment.php?attachmentid=47846&amp;d=1522104739[/img]


For starters I had always assumed that Elaine Herzberg had been coming from the North, the direction the Uber Volvo was heading. Looking at the map it makes more sense that she was coming from the South and was going to turn East on E. Curry Rd. My guess is that she had been traveling North on W. Lake View Dr. and had cut across to N. Mill Ave through the Rio Salada Rowing Club parking lot. She went up N. Mill Ave until she reached the bus stop which looks like a good place to cut across if you were going to turn on E. Curry Rd.

Unfortunately for her there was a lot of vegetation in the median strip at that point blocking her from the view of vehicles approaching from the south. As to why she walked into the roadway with the Uber vehicle approaching that is a good question. My guess is that she was unable to hear it approaching because of noise from vehicles in the southbound lanes, or she was impaired by alcohol or drugs. Hopefully the former.

Well, that's my theory and I'm sticking with it until the true facts emerge. I think it ironic that there have been so many misconceptions presented in news stories which would have been cleared up if the journalists had consulted Google Maps. A few articles mentioned that she was hit after crossing 3 lanes of traffic which makes it sound like it was undoubtedly the Uber vehicle's fault. Technically true but she crossed the two southbound lanes getting to the median strip where she was hidden behind the bushes and then emerged to cross one lane before getting hit by the approaching vehicle.

[img]http://music-electronics-forum.com/attachment.php?attachmentid=47847&amp;d=1522104739[/img]

UPDATE: After further study of the maps***, for my theory to work she would have had to cross the Tempe Town Lake Pedestrian Bridge from the south and gone east on the Tempe Town Lake North Bank Path to W. Lake View Dr, where she went north until she reached the Rio Salada Rowing Club parking lot and cut across to N. Mill Ave.

The red Google Map marker for 640 N. Mill Ave is slightly south of the vegetation I have mentioned but the drawing from the NY Times shows the accident happened at the north end of the bushes

I had originally thought that she was crossing northbound N. Mill Rd to turn right at E. Curry Rd. but she very well could have been planning to keep riding north on N. Mill Rd.

*** Immediately south of the freeway N. Mill Rd crosses Tempe Town Lake with two bridges, one northbound and one southbound, neither of which made sense to me. When I looked to the west and saw the pedestrian bridge going south to the Tempe Center for Arts everything fit into place, at least in my mind.


EDIT Here is the drawing posted previously from the NY Times showing the actual location of the accident so that the Google Maps screenshots in this post make more sense :

[img]http://music-electronics-forum.com/attachment.php?attachmentid=47783&amp;d=1521874669[/img]


Steve A.

P.S. I watched the police video again and I don't see how the Uber vehicle did not sense Elaine Herzberg when she was in the #1 lane. Judging from the map the slight bend in the road should not have blocked its "vision". So I think that there must have been a problem with the vehicle, as it should have been reacting in some way before the pedestrian suddenly appeared in the video. (The "driver" seemed to be pretty surprised, too, which tells me that the vehicle was not responding to the pedestrian and bicycle it should have detected.)

One article suggested that the Uber "brain" would give priority to its passenger which it was protecting rather than to the pedestrian but that would not explain the vehicle apparently doing nothing at all. Hmmm... the drawings I've seen don't show where exactly the vehicle ended up.

If this were a plane the NTSB could check the Black Box but with the vehicle being proprietary it will be up to Uber engineers to interpret the data. Do we trust them to be completely unbiased?!?

Here is a larger image of the map:
[ATTACH=CONFIG]47845[/ATTACH]



 
bob p 3/26/2018 8:27 PM
If you want to restrict the speed discussion to the issue of the speed being 38 vs 35, that is still a material consideration, though it relates to liability more than to outcome.

When you're speeding, you're breaking the law. When it comes to litigating this mess of an accident, we can count on the plaintiffs' lawyers making hay out of the fact that the car's programming explicitly approved of it speeding by allowing the car to exceed the speed limit. Even worse is that the Uber employee hired as a "safety driver" to oversee the safe operation of the vehicle explicitly condoned the condition of speeding.

Those two facts change the nature of the accident from one of being a simple "accident" / equipment failure to one of being a purposeful design to allow negligent operation of the vehicle, intentionally violating the law and placing the public at risk by doing so.

Ultimately, that will affect the amount of money that comes across in a judgement. In a case where the negligence was willful and wanton the courts typically award treble damages.
 
Enzo 3/26/2018 9:26 PM
When we are discussing WHY the car hit the woman, I don't see the difference between 35 and 38 to be substantive.

Liability is a separate discussion. If you want to go talk about legal liability, fine, but it doesn't impact WHY the car hit the woman. If the limit is 35 and the car is going 38, OK, they were breaking the law. But a car breaking the law in that manner STILL should have braked for her and not hit her. And that is the topic under discussion.

If the court finds for the woman's family or whoever brings suit, OK. If they treble the damages for being over the limit, OK. Neither thing sheds any light at all on WHY she was hit by the car.

If the "driver" was inattentive, a good case can be made for liability. Could he have stopped the car in time if he were more attentive? We don't know. But in the context of this discussion, what matters is that the car didn't prevent the collision in the first place. WHY was the driver even placed in the position of having to stop the car?
 
bob p 3/27/2018 8:00 AM
I see it as being a two-pronged problem:

1. The guidance systems on SDC aren't all that they're cracked up to be.
2. It takes killing someone for the people who design these things to get over their blind enthusiasm and admit to Point #1.
 
bob p 3/27/2018 8:04 AM
I posted that Lidar image a couple of posts earlier. It wasn't like I had a bunch of Lidar images to choose from, there was only one on Wikipedia, this one:

[IMG]http://upload.wikimedia.org/wikipedia/commons/a/aa/Road_Map_using_LIDAR.jpg[/IMG]

If you look closely at the image it tells you several things about Lidar. I'm surprised that nobody has commented on the problems in the image -- specifically why the oncoming traffic is not being adequately resolved.
 
eschertron 3/27/2018 8:27 AM
Quote Originally Posted by bob p View Post
I posted that Lidar image a couple of posts earlier. It wasn't like I had a bunch of Lidar images to choose from, there was only one on Wikipedia, this one:

[IMG]http://upload.wikimedia.org/wikipedia/commons/a/aa/Road_Map_using_LIDAR.jpg[/IMG]

If you look closely at the image it tells you several things about lidar. I'm surprised that nobody has commented on the problems in the image -- specifically why the oncoming traffic is not being adequately resolved.
I see a large shadow to the left of the hazy oncoming object. That, combined with the bright reflections off the overhead wires, suggests to me that this image is actually a composite Lidar-and-photo image. I originally thought the oncoming object was a bike, but the shadow looks too big.
Is a long-exposure photo frame confusing what we see here as a blur?

edit: looking at it again, I see the laser 'sampling' scans on the nearby objects (tree, guardrails, road surface) but see none farther out, just short of the bike. Is that the extent of the lidar range? Or do the individual scans resolve into a complete image at that distance?
 
Enzo 3/27/2018 9:46 AM
I know you see it that way, but implicit in that is that the failure here was systemic, rather than case specific.

What I mean is we often have people write in here with some problem with their amp, and they assume right off that it is a design flaw or some other systematic failure, rather than a simple breakdown of that particular amp. I don't think it fair to these cars to decide the problem here was a problem of concept. I think there was some particular problem with this particular car.

The engineers aren't crowing how great their systems are, they are just telling us they are good enough for beta testing. I see no evidence their enthusiasm is blind. I have seen the patterns the system uses to create frames of reference for things it detects. They are extensive, and they fan out to either side of the vehicle quite a ways, not to mention the overall environment of objects it sees.

No one claims systems are perfect yet. We can look at airline travel as a model for safety, they have earned it. But within my lifetime I recall the BOAC deHavilland Comet planes falling from the sky, and a later model Lockheed Electra losing a few planes to structural failure. These occurred during commercial use, not the testing phase. And of course the SST revealed a certain vulnerability beyond financial viability.
 
bob p 3/27/2018 10:58 AM
Nvidia just suspended their SDC program.

http://devblogs.nvidia.com/deep-lear...-driving-cars/
 
bob p 3/27/2018 11:06 AM
The State of Arizona suspended Uber's SDC program. Following the accident, Uber had voluntarily suspended the program. Now the State is suspending it.

Arizona suspends Uber's self-driving car tests after fatal crash - Mar. 26, 2018

Google/Waymo is still being allowed to test in AZ.

Last week the City of Boston suspended NuTonomy SDC testing in Boston.
http://www.thedrive.com/tech/19493/n...tal-uber-crash

Toyota suspended their SDC program.
http://www.thedrive.com/news/19442/t...lving-uber-car

Ford, GM, BMW and Hyundai are still going forward.
 
bob p 3/27/2018 12:12 PM
Elaine Herzberg was killed by the Uber SDC on March 18, 2018.

As some of you may know there was supposed to have been a vote on a federal bill (AV START Act, S. 1885) to relax the legal constraints related to SDC regulation. The Bill, if passed, would have exempted SDC from the DOT regulations / Federal Motor Vehicle Safety Standards (FMVSS) that are required for passenger cars sold to the public. The language of the Bill isn't limited to SDC used for R&D purposes -- it would have exempted ALL SDC, including those sold to customers rather than used in experimental development, from those pesky DOT regulations relating to driver and passenger safety. This bill was clearly penned by lobbyists for the SDC industry, who wanted all of the little "nuisance" automotive safety regulations that have been developed over a span of many decades to be set aside.

The Uber-Herzberg death put an end to that, at least for now.

What's interesting is that the Advocates for Highway & Auto Safety wrote a letter to senators Schumer and McConnell on March 5th -- two weeks before Herzberg's death -- expressing many concerns about the safety defects in SDC, with one of their main concerns being that SDC are not able to identify bicyclists.

One of their main points was the same point that I had made earlier: that while every State's BMV has a policy of restricting the issuance of an operator's license to someone who can pass a minimal vision screening test, there exists no requirement for testing object recognition by SDC navigation systems.

As you might expect, the proposed Senate Bill included language to prohibit States from passing their own regulations, such as object recognition testing.

The letter:
Letter to Senate Leaders on Driverless Car Bill | Advocates for Highway and Auto Safety

A Story in AutoNews:
Senate leaders urged to ignore 'bogus urgency,' fix flaws in autonomous car bill

It was well known that SDC technology could not properly identify bicyclists prior to the Herzberg killing. Considering that the Advocates for Highway & Auto Safety were aware of the problem before the Herzberg death, it's reasonable to consider that the SDC car developers knew about it too.

Well, the way I see it there are only 2 options:

1) they never thought about Problem X, or
2) they did think about it but instead of stopping production they said, "fuck it."

Either way, a lady got killed due to their fuckup.
 
bob p 3/27/2018 12:24 PM
The Bill as Introduced:

http://www.gpo.gov/fdsys/pkg/BILLS-1...115s1885is.pdf

AMERICAN VISION FOR SAFER TRANSPORTATION
THROUGH ADVANCEMENT OF REVOLUTIONARY
TECHNOLOGIES ACT

the Report of the Committee on Commerce, Science and Transportation:
http://www.gpo.gov/fdsys/pkg/CRPT-11...115srpt187.pdf
 
bob p 3/27/2018 12:46 PM
Quote Originally Posted by eschertron View Post
[IMG]http://upload.wikimedia.org/wikipedia/commons/a/aa/Road_Map_using_LIDAR.jpg[/IMG]
I see a large shadow to the left of the hazy oncoming object. That, combined with the bright reflections off the overhead wires, suggests to me that this image is actually a composite Lidar-and-photo image. I originally thought the oncoming object was a bike, but the shadow looks too big.
Is a long-exposure photo frame confusing what we see here as a blur?
Wikipedia said it was a Lidar image, not a composite Lidar-photo image.

I agree, my impression is that the oncoming vehicle is a person riding a bike.

Looking at the shadows suggests that the light source that is providing backlighting to the oncoming "motorist" is low on the horizon. That's what it takes for a small object to cast a long shadow. It might also explain why the tree branches in the distance are brighter than the rest of the scene. What is clear from the shadows in this image is that the primary light source illuminating the scene is located off in the distance, to the right.

As Enzo pointed out in a previous post, the lasers used in Lidar scanning have to be eye-safe. As such they are mandated to be low output Class 1 devices, such as the Velodyne rotating unit used in the Ubers.

You have to wonder whether low-output Lidar is at all effective in identifying an object that is silhouetted by a high intensity IR backlight such as the setting sun. If the bit depth of the lidar is insufficient, the reflected lidar signal could be insignificant compared to the intensity of the sun, rendering a backlighted bicyclist as a black silhouette. Unfortunately, black silhouettes look like empty space, unless you're a person with a brain. It's unfortunate, but the Velodyne Lidar technical data sheet's specifications conspicuously avoid saying anything about the dynamic range of the sensor.
 
eschertron 3/27/2018 1:48 PM
Quote Originally Posted by bob p View Post
Wikipedia said it was a Lidar image, not a composite Lidar-photo image.

I agree, my impression is that the oncoming vehicle is a person riding a bike.

Looking at the shadows suggests that the light source that is providing backlighting to the oncoming "motorist" is low on the horizon. That's what it takes for a small object to cast a long shadow. It might also explain why the tree branches in the distance are brighter than the rest of the scene. What is clear from the shadows in this image is that the primary light source illuminating the scene is located off in the distance, to the right.

As Enzo pointed out in a previous post, the lasers used in Lidar scanning have to be eye-safe. As such they are mandated to be low output Class 1 devices, such as the Velodyne rotating unit used in the Ubers.

You have to wonder whether low-output Lidar is at all effective in identifying an object that is silhouetted by a high intensity IR backlight such as the setting sun. If the bit depth of the lidar is insufficient, the reflected lidar signal could be insignificant compared to the intensity of the sun, rendering a backlighted bicyclist as a black silhouette. Unfortunately, black silhouettes look like empty space, unless you're a person with a brain. It's unfortunate, but the Velodyne Lidar technical data sheet's specifications conspicuously avoid saying anything about the dynamic range of the sensor.
Wikipedia said it was a Lidar image
I'm guessing an error of omission. I don't know the lidar technology, but the 'true RGB' values for the yellow lines and the green grass (on the LH bank) strongly suggest a composite image, unless the lidar receiver records broadband light -- in which case there's a whole lot of processing power required to separate the laser reflections from the reflected incidental light. Not what I'd expect. But I have been wrong before.
 
bob p 3/28/2018 8:35 AM
I'm not sure that a whole lot of processing power is needed to sort out IR light vs. visible light. Cameras use passive detachable filters to selectively block either visible or IR light. Depending upon the type of filter chosen, they can selectively block some visible bands while letting others pass. Some will even produce color changes, which is known as a "false coloration" type of IR filter.

In the world of analog electronics we use bandpass filters to select by frequency, and in the world of DSP it's simple to just ignore specific frequency bins.
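For what it's worth, here's the kind of thing I mean by ignoring frequency bins in DSP -- a bare-bones illustration, not anything from an actual lidar or camera processing chain, and the sample rate and band edges are arbitrary numbers I picked:

[CODE]
# Minimal example of "ignoring frequency bins": FFT the signal,
# zero out the bins you don't care about, and transform back.
# The sample rate and band edges are arbitrary illustration values.
import numpy as np

fs = 1000.0                                   # sample rate, Hz (arbitrary)
t = np.arange(0, 1.0, 1.0 / fs)
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 200 * t)

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)

keep = (freqs >= 40) & (freqs <= 60)          # pass band: keep only ~50 Hz
spectrum[~keep] = 0                           # "ignore" every other bin
filtered = np.fft.irfft(spectrum, n=len(signal))

print(round(np.std(filtered), 2))             # ~0.71: only the 50 Hz tone survives
[/CODE]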

It would be nice to know what's actually going on in that photo. Unfortunately Wikipedia doesn't provide as much detail as we'd like.
 
Enzo 3/28/2018 1:41 PM
The photo above was a lidar image, so it is reflected light from the laser on the car. The receptor would then only need to be sensitive to the wavelength of the laser. That is different from camera imaging, where available light is the light source. In practice, the receptor is likely sensitive to a wider range of wavelengths than just the one, so other light sources within its band would register just like the reflected laser does.

I claim no special knowledge, but IR includes information that visible does not. Including the thermal you have discussed. What is black to our eyes may or may not be "black" to IR. So having the vision cameras seeing down into IR can only be a plus, in my view. (pun intended)

I don't know that the system needs to sort IR from visible. I don't think having visible will confuse something just paying attention to IR and vice versa. But maybe.

Here is my guess at the photo:
It looks like a night shot to our eye, because of all the dark areas, but I tend to think it is a daylight image. That explains the shadow to the side of the oncoming whatsis. Our laser would have put a shadow behind the whatsis. The bright white off in the distance and up in the trees, along with the bright return over to the right in the distance, furthers my thinking that it is sunlit and that sunlight is getting into our sensor. That stuff off to the right appears to show THROUGH foreground shrubbery, meaning it is not reflected lidar laser.

I also would not be surprised if this is a composite image. You can see the sweep lines from the laser, most easily on the near pavement and the tree trunk leaning in from the right edge. But not on everything.
 
bob p 3/28/2018 3:23 PM
The data sheet for the Velodyne sensor lists the wavelength of the laser emitter, but says nothing about the spectral response of the receiver.

There's another accident that happened in California on March 23rd or thereabouts, with a Tesla driving at cruising speed into a concrete lane divider on Hwy 101, bursting into flames, and killing its occupant. The front end of the car was essentially shattered off of the passenger cage. There's no word out yet on whether this was a human-piloted vehicle or whether the driver had engaged the Tesla's "auto-pilot." What has been mentioned in the news is that the concrete divider appeared not to have the usual crash barricade in front of it, which I take to mean the yellow plastic drums full of water with reflectors on them. Looking at the video, it looks like the concrete divider had a steel guardrail tip, which may have been difficult for an optical sensor to recognize, if one was engaged.

http://www.nbcbayarea.com/news/natio...477762763.html

The driver was killed, much of the Tesla X was destroyed in the fire, and Tesla employees were called to clean up the battery debris before the tow trucks hauled the wreckage away. Chances are that there won't be any evidence left to confirm whether the Tesla X was operated by a person or by auto-pilot. I have to wonder whether it was really a good idea to allow Tesla's accident response team to clean up the crash site.

Uber announced today that they won't be renewing their SDC testing permit in California.
 
Enzo 3/29/2018 7:09 PM
I have been in the position of defending the cars somewhat. Not because I believe in them, but because I felt some of the causes and reasons being proposed for the failure were unfair, etc...


But I don't want to share the road with them. Put a giant strobe light on top so I know when one is around. I do have serious concerns. I know the car vision can read and interpret road signs. It reads speed limits, it knows no left turn, do not enter, one way street, and so on. I know they have made great effort - recent mishap aside - in recognizing humans standing around. But I just read of a concern that had not occurred to me. A road construction worker is standing in the lane gesturing to the car to drive around via the oncoming lane. This happens all the time around here during construction season, often with a flag man at each end. The car will see the person and stop, but how will it interpret the guy waving it to the side? I can think of various scenarios wherein humans are making meaningful gestures. A twig or small branch on the street would likely be ignored by the car, but what if it is a live power wire? It looks like a tree part or a rope. And what if a human is waving to point it out?
 
Steve A. 3/30/2018 10:36 AM
Quote Originally Posted by bob p View Post
If you want to restrict the speed discussion to the issue of speed being 38 vs 35 that is still a material consideration, though it relates to liability, more than to outcome.

When you're speeding, you're breaking the law. When it comes to litigating this mess of an accident, we can count on the plaintiffs' lawyers making hay out of the fact that the car's programming explicitly approved of it speeding by allowing the car to exceed the speed limit. Even worse is that the Uber employee hired as a "safety driver" to oversee the safe operation of the vehicle explicitly condoned the condition of speeding.

Those two facts change the nature of the accident from one of being a simple "accident" / equipment failure to one of being a purposeful design to allow negligent operation of the vehicle, intentionally violating the law and placing the public at risk by doing so.

Ultimately, that will affect the amount of money that comes across in a judgment. In a case where the negligence was willful and wanton the courts typically award treble damages.
Bob, when I posted this picture the other day I had no idea how close it was to the scene of the accident. The maps I posted show that it was very close (the car at the top of the picture is within yards of the accident- maybe 10 yards.) So even if the police thought that the speed limit was supposed to be 35mph it certainly looks like the closest sign said 45mph. As I suggested the speed limit immediately before the sign may have been higher. Maybe not... I have no idea what the speed limit on the northbound bridge would be.

[img]http://music-electronics-forum.com/attachment.php?attachmentid=47782&d=1521862110[/img]

My point was that with a 45mph speed limit a pedestrian should have exercised a LOT more caution than if it had been 35 mph. And the maps I posted indicated no sharp turn which would have prevented the car from detecting the bicycle when it entered the #1 northbound lane. The reactions of the safety driver suggest that the car had not responded at all to the bicycle before it was too late. So I suspect some malfunction in the car.

Steve A.

P.S. We just had another self-driving car death here in the SF Bay Area....
 
bob p 3/30/2018 10:58 AM
Quote Originally Posted by Steve A. View Post
P.S. We just had another self-driving car death here in the SF Bay Area....
Would that have been the Wei Huang / Tesla X crash on the 101 in Mountain View? I linked the story two posts earlier, but I hadn't heard whether the Tesla was on Auto-Pilot or not. I would suspect that it was, as most humans would never drive head-on into a lane-dividing concrete barrier, but it appears that the only evidence that could prove that the Tesla was on auto-pilot was destroyed in the crash (or in the cleanup).

I'm with Chuck. I don't want to get into a Johnny Cab, and I think the people who do need to have their heads examined. They throw their lives away by getting into a Level 2 vehicle and treating it like it's Level 5.

Autonomous driving levels 0 to 5: Understanding the differences
https://www.techrepublic.com/article...e-differences/
 
Enzo 3/30/2018 2:11 PM
Re: 45mph limit sign.

When we brought it up recently, I went to google street or whatever that view is called, and ran back (against traffic so as to find where they came from) quite a ways to see what speed limit was before that 45 sign, and all I could see was the 45. If it had been 35, I didn't go back far enough.
 
DrGonz78 3/30/2018 2:35 PM
Quote Originally Posted by Steve A. View Post
So even if the police thought that the speed limit was supposed to be 35mph it certainly looks like the closest sign said 45mph. As I suggested the speed limit immediately before the sign may have been higher. Maybe not... I have no idea what the speed limit on the northbound bridge would be.
I have driven that exact stretch of road so many times before too. I was surprised to see a 45mph speed limit that you posted from Google maps. I would have guessed it to be 40mph max at that exact location, which means drive 45mph.

Uber had recently removed the Volvo collision avoidance system in the car too, which may point some of the blame at Uber's own collision avoidance system in this case.

Another thing is that the video I saw of the crash appears very dark on that stretch of roadway. I saw some youtube videos of people driving that stretch of roadway and the contrast was much brighter. Still, those other videos could have been made when there was a fuller moon in the sky or something.

The victim's family has already reached an out-of-court settlement with Uber.

Governor Ducey was responsible for bringing the driverless Uber cars to the Phoenix area for testing. Details have emerged that he did not have a plan in place to keep Uber accountable while it was testing the vehicles on the roadways. In California, by contrast, Uber has to report all sorts of data to the state/city for review.
 
DrGonz78 3/30/2018 2:39 PM
Quote Originally Posted by Enzo View Post
Re: 45mph limit sign.

When we brought it up recently, I went to google street or whatever that view is called, and ran back (against traffic so as to find where they came from) quite a ways to see what speed limit was before that 45 sign, and all I could see was the 45. If it had been 35, I didn't go back far enough.
It definitely would have been 35mph or even 25mph going down Mill Ave. As you travel northbound on Mill Ave., just after Rio Salado Pkwy the limit would definitely go up to 35mph, and right after that you hit the bridge portion. Just surprised to see the 45mph sign suggesting to speed up before hitting the intersection at Curry Rd. Pretty stupid to put up a 45mph sign at that point since people will think they can drive 50mph.

Edit: Just pulled up Google maps, and as you travel Mill Ave. going northbound crossing Rio Salado there is a 30mph speed limit posted right away. As you hit the bridge it is 35mph. Exiting the bridge is the 45mph sign in that picture, and the accident occurred just after.
 
Steve A. 3/31/2018 12:29 AM
Quote Originally Posted by DrGonz78 View Post
It definitely would have been 35mph or even 25mph going down mill ave. As you travel northbound on Mill Ave. and just after Rio Salado Pkwy the speed would definitely go up or be at 35mph at that point, then right after that you hit the bridge portion. Just surprised to see the 45mph sign suggesting to speed up before hitting the intersection as Curry rd. Pretty stupid to put up a 45mph sign at that point since people will think drive 50mph.

Edit: Just pulled up Google maps and as you travel Mill Ave. going northbound crossing Rio Salado there is a 30mph speed limit posted right away. As you hit the bridge it is 35mph. Exiting bridge is that picture of 45mph and then the accident occurs just after.
Thanks for filling us in on the speed limits on the bridge. Yes, once you enter the residential area I think 25-35 mph would be more appropriate (you drive on those streets so I would take your word on it.) And yes someone really screwed up posting the 45 mph sign where N. Mill was going under the freeway. I think that it was supposed to be 35 mph as the police said in the initial reports but someone ordered or put up the wrong sign. Someone here mentioned that the Uber vehicles do read the speed limit signs.

And yes, as you mentioned Arizona is like the Wild West for self driving vehicles without all of those damned regulations imposed in California.

Steve A.

P.S. Many people have pointed out that the dashboard camera video showed the street to be darker than it actually was (perhaps a compromise to keep detail in videos taken in bright daylight...?)
 
DrGonz78 3/31/2018 1:27 AM
Quote Originally Posted by Steve A. View Post
And yes someone really screwed up posting the 45 mph sign where N. Mill was going under the freeway. I think that it was supposed to be 35 mph as the police said in the initial reports but someone ordered or put up the wrong sign.
The 45mph sign can be seen there going all the way back to, at least, 2008 on the Google maps page. Thinking back to all the times I drove over that bridge, I never sped up to 45mph while approaching the next intersection. What I recall is that 9 out of 10 times you hit the red light at Curry Rd. or you're turning left or right, but some of those times were in heavier traffic.

Essentially though, the Uber car must have been speeding up a bit right at that very moment of impact. Talking with my mom today, we both agreed that every time we saw the Uber cars they seemed to be going a bit fast. Not really speeding, but you could just sense a quickness about the driving style.
 
Steve A. 3/31/2018 1:50 AM
Quote Originally Posted by DrGonz78 View Post
The 45mph sign can be seen there going all the way back to, at least, 2008 on the Google maps page. Thinking back to all the times I drove over that bridge I never sped up to 45mph while approaching the next intersection. What I recall is that 9 out of 10 times you hit the red light at Curry rd. or your turning left or right, but some of those times were in heavier traffic times.

Essentially though the uber car must have been speeding up a bit right at that very moment of impact. Talking with my mom today we both agreed that every time we saw the uber cars that they seemed to be going a bit fast. Not really speeding but you could just sense quickness about the driving style.
I found an article about how Uber disabled the crash avoidance system built into the Volvo SUV so that they could test their own system. Some interesting pictures from the article...

Uber 'disabled Volvo SUV's safety system' before car killed pedestrian | Daily Mail Online

[img]http://music-electronics-forum.com/attachment.php?attachmentid=47919&amp;d=1522482014[/img]


[img]http://music-electronics-forum.com/attachment.php?attachmentid=47920&amp;d=1522482014[/img]

Steve A.

 
Steve A. 3/31/2018 2:06 AM
What I find puzzling is why Uber is even looking into self-driving vehicles as their whole business plan is based on the idea of using their drivers' vehicles and insurance to reduce their costs. They were making money as the middle man between people needing rides and drivers looking to make money driving people around. And really putting the hurt on taxi companies and their drivers. Taxi medallions in SF cost $250,000 with what used to be a 10 year waiting list to buy one. Now people are trying to sell their medallions... good luck!

https://www.kqed.org/news/10693923/f...s-now-a-burden

Steve A.
 
Enzo 3/31/2018 4:34 AM
Most personal car insurance excludes commercial use. Very few people act on that. But if you are Ubering in your car, or delivering pizzas in it, and bang into someone, your insurance might not cover it for that reason. And if an Uber car causes someone's loss, the Uber name will be on the suit along with the driver. And when the driver is found not to be insured for that use, Uber is right in the crosshairs.

Uber's model is making money. I suspect they are looking into not having to contract drivers. Their own cars would be available 24/7, not at the leisure of contractors. They would not have to pay commissions. They could choose to self-insure. I rented a car for a day a couple years ago from Enterprise. They told me they self-insure. If the insurance on a small car is $2000 a year and they have 10,000 cars, that's $20 million a year in premiums; they are probably not going to pay out that much in lost liability suits.
 
J M Fahey 3/31/2018 10:50 AM
Self insurance can be very good business.
A friend of mine designed a luggage insurance scheme for the ship company joining Argentina and Uruguay across the Rio de la Plata, one of the widest (if not *the* widest) rivers in the world, going from 50km to 300km wide (justly called the "sweet water sea" for that).
They offered a no-questions-asked U$1000 payment for lost/stolen luggage for a meager $1 to $5 (depending on size and weight) insurance ticket.

I asked them: "how many luggage pieces do you transport a year?" : "over one Million"
" how many are lost/stolen?" ..... "not a single piece!!!!! "

They play it much safer than Uber though: luggage is labelled and registered with a bar code on boarding, and stored in a special section in the hull with a guarded single in/out door, which is then locked, and unlocked and opened only on arrival.
 
bob p 3/31/2018 1:12 PM
Uber Plans to Eliminate an Entire Class of Low-Paying Jobs: Taxi Drivers
My neighbor backed into my car and cracked its bumper. When I called my insurance company to file a claim, they asked several scripted questions. The first one was whether I've ever used the car in a ride-sharing service. Of course, they wanted to invalidate my policy to avoid paying the comprehensive claim.

Uber doesn't aspire to be in the taxi business forever. Their long term plan is to use the 3rd party drivers and vehicles to develop brand recognition, and then replace them with robotic cars as soon as they can. I heard the CEO talk about that in interviews on the stock market TV channels before the Herzberg killing.

It seems that everyone in the SDC industry is proceeding at a breakneck pace to be the first to market with the robotic taxi because there's huge money in it as a global business. Self-driving taxis export people's jobs to nowhere. From a profit standpoint, it's better than sending a manufacturing job from the USA to Mexico and then to Southeast Asia, always in search of a lower wage employee to reduce manufacturing cost. With a robotic taxi they don't even have to pay a third world wage, because they've completely eliminated the cost of the employee driver.

This SDC taxi idea does not bode well for the American economy, as it will completely eliminate an entire class of service industry jobs -- taxi driving. Another thing that bothers me about Uber -- something I haven't mentioned yet -- is that they falsely portray themselves as a company that allows people who need money to moonlight as drivers and "get their hustle on" (company slogan). That's a false portrayal, as Uber has no intent of employing drivers over the long term. They're just using those people for the short term, until they can replace them under their long term plan of autonomous taxis. We've already exported the best manufacturing jobs, and now we've got companies working to eliminate some of the worst paying jobs in the service industry as well. If this keeps up, poor people won't even be able to drive a taxi to earn a subsistence wage.
 
bob p 3/31/2018 1:39 PM
These companies that are working on the development and deployment of SDC aren't doing it to improve safety or to benefit mankind. Those are the cover stories that they use to sell the idea to naive and gullible people. The reality is that these companies are fighting one another to be the first to make billions of dollars by eliminating the cost of professional human drivers on a global scale, and they're willing to cut all sorts of corners to be the company that gets there first.

It's unfortunate that the lure of big money has led many of these SDC companies to make irresponsible decisions in the quest to be first to market. All too often they throw safety concerns out the window in favor of rapid advancement in their SDC technology. As noted in the Senate Bill that was linked previously, they've gone so far as to press for legislation to exempt their vehicles from all of the Federal Motor Vehicle Safety Standards that apply to cars.

The FMVSS came into existence over a period of decades as the result of coping with human deaths on highways. The government realized that this was a problem and put regulations in place to improve vehicle safety. The government took these steps because it was the right thing to do, and Detroit hated it. It ended up costing them billions. Now the SDC companies want to be exempted from the FMVSS because the cost of compliance with FMVSS is huge -- it costs Detroit billions of dollars per year, and the SDC companies would rather not have that expense. They want to be exempted from the FMVSS that have evolved over decades so that they will have a competitive cost advantage over building traditional cars, enough of an advantage to render the traditional car non-competitive and economically obsolete.

There is so much money to be made in this scenario -- billions upon billions of dollars -- that they can afford to kill people along the way and just write checks as the cost of doing business. The reality is that as much as people value other people, corporations don't value people, they value money. When corporate profits are high enough that the cost of killing people is small on the balance sheet, it's easy to view the occasional death as the cost of doing business. It's like fuel tanks that burst into flames in an accident, an ignition switch related death, or an airline losing your suitcase. In many of these situations the corporations find it less expensive to pay out losses than to fix the problem. Yes, eventually public outrage catches up with cost-cutting, and eventually the government would start to enforce Federal SDC Safety Standards if there were none, but the SDC industry would profit by delaying those expenses as long as possible to increase their up-front profits. It's Business 101.

Choosing to disable the Volvo anti-collision system is a good example of irresponsible/negligent behavior on the part of Uber. The car came from Volvo with an accident avoidance system, and they turned it off. WTF?!? If Uber wanted to test their own accident avoidance system, why couldn't they just leave both systems in place and have the SDC computer make note of which one activated first? I think the answer is that it was cheaper to disable the Volvo/Aptiv system than it was to hire engineers to reverse engineer the Aptiv system to make it compatible with the Uber system.

The whole idea of turning off an accident avoidance system that Aptiv had spent untold millions developing for Volvo, and replacing it with their own unproven system supervised by a felon with a very bad driving record, is just reprehensible. This is the kind of thing that can make a company like Uber go belly up.

Here is a video that gives some information on the driving record of Rafaela Vasquez. Apparently Vasquez has multiple convictions for speeding, running red lights, driving with an expired license, driving with an expired registration, driving without insurance, and a felony conviction for robbery that led to 4 years in prison. When asked why Uber thought that this person was qualified to act as a "safety driver", a spokesman replied, "Everyone deserves a fair chance." I don't agree. Setting aside her status as an ex-con, I don't think it's a good idea to hire someone with such a poor driving record to work as a "safety driver."

Vasquez also told police that she attempted to apply the brakes before hitting Herzberg. That appears to have been a lie, as the news reports and videos clearly show that neither the SDC nor its driver made any attempt to brake before hitting Herzberg. The interior dashcam shows that Vasquez was unaware of Herzberg until the Uber had hit her.

http://video.dailymail.co.uk/video/m...7410812501.mp4
 
J M Fahey 3/31/2018 1:52 PM
I think the concept of self driving cars **as is** is DEEPLY flawed.
Having to share roads with human driven (ugh!!!) cars, which are absolutely unpredictable, and actual P.E.O.P.L.E (same thing or worse) puts all the load on the car's sensors and processing power to solve, in real time, that zillion-variable, constantly changing equation. Impossible task.

On the contrary, with no humans involved at all ... even as "victims", the problem complexity drops by several degrees.

Accidents happen because people put themselves in the path of robotic cars, or are hit because said cars have to maneuver outside of their own, optimized path to avoid hitting John, so they hit Peter instead.
But ... what was John doing in the Robocar path in the first place?

Some still have a very 30's to 50's SF vision as to what Robots are or should be, a very anthropomorphic ("human shaped") one.
[IMG]https://digital.hammacher.com/Items/10921/10921A_1000x1000.jpg[/IMG]

which is childish.

This is a real robot.
To be more precise, a dozen of them, operating autonomously, mining 24/7/365 if needed.

There *is* a human supervisor, mainly for safety and to handle unexpected problems, such as a machine breaking down.
In another CAT video they mentioned that each truck or digger demanded "4.5 Humans on average" per day, so they are saving 26 salaries ... not bad at all.
 
Chuck H 3/31/2018 2:58 PM
Decades ago I always imagined SDC's would be like trains, sort of. There would be no possibility for human error because there would be no humans involved. The SDC's would recognize each other and sort of link up in a very efficient line. When a car needed to be added or subtracted from the line a program would handle it with an efficiency unknown by anyone who has waited at a metered freeway ramp. This would only apply to freeway/highway driving. On city streets you're on your own. So getting on the freeway would be sort of like boarding a train. The idea that we skipped this option completely and are now testing what amounts to autonomous taxi cabs on city streets is straight up clown shit crazy. It will NEVER work the way they are trying to do it and they (the ubiquitous "they") should stop trying. WRT mixing human drivers and robot drivers, how did the human race breed people that are smart enough to design and build a car that can ALMOST drive itself, but too dumb to know it CAN'T work?
 
nickb 3/31/2018 3:17 PM
Anyone care to comment on the data that shows the accident rate per mile of SDCs is lower than that of HCCs? The reports that I found were a little dated, so one would hope that things have improved.

OTOH we all know about software upgrades...

I too imagined the system working with road sensors and with vehicles communicating with one another. I suspect what is currently proposed is merely a stepping stone along that path that will work with the existing infrastructure. It's the economics of the situation.
 
J M Fahey 3/31/2018 5:19 PM
Quote Originally Posted by Chuck H View Post
Decades ago I always imagined SDC's would be like trains, sort of. There would be no possibility for human error because there would be no humans involved. The SDC's would recognize each other and sort of link up in a very efficient line. When a car needed to be added or subtracted from the line a program would handle it with an efficiency unknown by anyone who has waited at a metered freeway ramp. This would only apply to freeway/highway driving. On city streets your on your own. So getting on the freeway would be sort of like boarding a train. The idea that we skipped this option completely and are now testing what amounts to autonomous taxi cabs on city streets is straight up clown shit crazy. It will NEVER work the way they are trying to do it and they (the ubiquitous "they") should stop trying. WRT mixing human drivers and robot drivers, how did the human race breed people that are smart enough to design and build a car that can ALMOST drive itself, but too dumb to know it CAN'T work?
That's *exactly* the point.
I was going to mention trains but suspected everybody would BOOOO!!!! me as being unrelated.

In fact you can *easily* do this with standard gasoline-and-rubber-tires-on-asphalt cars, on roads and even on streets, if (easy, already proven technology) you bury a guidance wire along the roads which will be used for this (probably radiating very low power RF), including properly spaced little antennas before corners or intersections so the car slows down enough to take them (detecting 2 wires crossing at an intersection is too late; the laws of mass and inertia still apply).

Not even a buried wire is needed, a 4" wide white line painted on top and photosensors is good enough.
Hey!!!! one of my earliest Electronics projects was building a "white line following" Robotic Turtle, back in 67 or 68!!!!

Full Electronics and Mechanics consisted of a Battery, a single Photocell, 2 hand torch lamps, a Relay and 2 toy Electric Motors!!!!

Turn sound OFF, nerds have TERRIBLE musical taste.


What's important is that these "weaker than a mosquito brain" cars work better than Uber's best ... just because their 1-bit brains (literally: the only choice is between Right and Left, and for that a single Comparator is enough) are asked the proper question .... while Uber's are asked to divide by Zero (predicting what dumb Humans can do).

This is the modern version of what I built more than 50 years ago:
[IMG]http://rookieelectronics.com/wp-content/uploads/2012/08/Line-Following-Robot-using-Transistors.jpg[/IMG]
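For anyone curious, the control "algorithm" really is about that dumb. Here's the same one-bit idea sketched out in Python; read_sensor() and set_motors() are hypothetical stand-ins for whatever photocell and motor driver you happen to have, not any real robot API:

[CODE]
# One-bit line follower: a single comparator output decides everything.
# read_sensor() and set_motors() are hypothetical placeholders for whatever
# photocell and motor driver you have -- this is a sketch, not a real API.
import time

THRESHOLD = 0.5   # reflectance above this = "photocell sees the white line"

def run(read_sensor, set_motors):
    """Bang-bang edge follower: wobble along the edge of the painted line."""
    while True:
        if read_sensor() > THRESHOLD:          # on the line -> veer off it
            set_motors(left=0.8, right=0.3)    # curve right
        else:                                  # off the line -> veer back on
            set_motors(left=0.3, right=0.8)    # curve left
        time.sleep(0.01)                       # ~100 Hz decision rate
[/CODE]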

Of course, bright pink or green fluo lines should be painted on the surface to warn dumb cars and pedestrians AWAY!!!

And anybody venturing there anyway and causing an accident should be sued and jailed for splotching Robocars with internal fluids.
Not kidding, it's the only way this can work in the foreseeable future.
 
Enzo 3/31/2018 5:37 PM
No, white lines don't work; they are working on that very thing here in Michigan, where the roads are covered in snow and ice large parts of the year. Snow and ice tend to be the same color as white stripes.

A wire in the road need not be RF radiating, it could be inductive, a loop on the car picking it up. I wouldn't worry about right angle cross streets, I think such a system would have built in location awareness. The car doesn't have to be led down to the corner to decide. The car would know it is about 100 yards from intersection ABC123, and will know where it is going ahead of time.

In railroads, where they cross, there will be tangent track, like exit ramps, between crossing tracks at the corners. They have to set the track switches. In a dumb wire setup, you'd have to have such transition tracks on the corners, and the car would have to know ahead of time which way to go at the switch. So the car has to have positional awareness either way.
 
Chuck H 3/31/2018 6:17 PM
Part of the problem might be that these are what amounts to analog solutions in a digital world. No one will fund low tech projects because there's no kudos in it. Hell, you might as well just have them on a track like tour cars at the amusement park. Nope. Those solutions are stupid! How much smarter to try and teach a computer to drive using limited sensory perception.
 
Enzo 3/31/2018 11:08 PM
Downtown Detroit has a People Mover. It is an elevated track and is essentially a single small subway car. Well, several of them spaced out. Of course the route is set, and hard to change, due to the track. A guide wire setup in the street could easily be changed if needed, and little autonomous cars could perform a similar service. The route is still limited to the wire course. One could even set up lanes for it that exclude other traffic.

West Virginia Univ has a similar rail car deal, very much like Detroit's mover. In both cases, the cars are autonomous. They don't ride on the street, but a street-running version, far short of turning cars loose on the roads, could be done.
 
J M Fahey 4/1/2018 12:19 AM
A rail is the simplest solution, but it is fixed.
Embedded wires form a grid covering the whole city; it's very simple to program a car to go, say, 6 streets along #438, turn right for 27 more streets at #217 (odd might go N-S, even might go E-W), then turn left at #396 for 3 more streets and pull into the garage door that will be waiting for it about half a block in, of course radiating its own callsign.

Extremely limiting you say?

Not at all: even a human driving a car *must* follow established streets, he can't cut through a city block full of buildings, so complexity/flexibility is the same.

A rail forces car to a fixed path, crossings where it may turn are somewhat complex, while a robocar can follow wires which actually do not *force* it (like a rail does) but merely show the way.

With the exact same gridset, 100 cars can follow 100 different paths, following what was programmed in them.
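To make that concrete: the whole "program" for a trip could be nothing more than a short list the car steps through. This is only a sketch -- the wire numbers and the follow_wire()/turn() calls are invented placeholders, not any real guidance hardware:

[CODE]
# A programmed route over a guide-wire grid: each step says which wire to
# follow, for how many blocks, and which way to turn at the end of the leg.
# Wire numbers and the instruction format are invented for illustration only.
route = [
    ("wire_438", 6,  "right"),   # 6 blocks along #438, then turn right
    ("wire_217", 27, "left"),    # 27 blocks along #217, then turn left
    ("wire_396", 3,  "stop"),    # 3 blocks along #396, garage door is waiting
]

def drive(route, follow_wire, turn):
    """follow_wire(wire_id, blocks) and turn(direction) are hypothetical
    calls into the guidance hardware; the car never needs to 'see' anything
    beyond the wire under it and a block counter."""
    for wire_id, blocks, direction in route:
        follow_wire(wire_id, blocks)
        if direction != "stop":
            turn(direction)
[/CODE]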

I'm always trying to find the simplest/dumbest solution for everything, but if the next level of complexity is acceptable, GPS precision could be increased a hundredfold (I think current resolution is 100ft/30 meters or so) and then we could dispense with wires or optical signs; cars could know where they are even in the middle of an unexplored forest or the middle of the desert.
 
Enzo 4/1/2018 12:40 AM
A wire follower is limiting in that the streets must have the wires, it cannot detour around the block if there is a blockage.

One problem here is that the roads are in such terrible condition. I am talking specifically Michigan. They cannot come up with money to repave the roads or even fix the seasonal potholes, let alone bury guide cable infrastructure. No one cares about private investors like Uber, but ask the people to tax themselves to fix the roads or add something remotely resembling "mass transit" and they scatter and hide.
 
Steve A. 4/1/2018 4:06 AM
Quote Originally Posted by Enzo View Post
Most personal car insurance excepts commercial use. Very few people act on that. But if you are Ubering in your car, or delivering pizzas in it, and bang into someone, your insurance might not cover it for that reason. And if an Uber car causes someone's loss, the Uber name will be on the suit along with the driver. And when the driver is found not to be insured for that use, Uber is right in the crosshairs.

Uber's model is making money. I suspect they are looking into not having to contract drivers. Their own cars would be available 24/7 not at the leisure of contractors. They would not have to pay commissions. They could chose to self-insure. I rented a car for a day a couple years ago from Enterprise. They told me they self-insure. IF the insurance on a small car is $2000 a year, and they have 10,000 cars., they are not probably going to pay out that much is lost liability suits.
Enzo, with the new "gig economy" companies like Airbnb try to make the maximum revenue on the smallest investment possible. Rather than borrowing money to buy motels and hotels they hook up people who own or lease housing to people looking for short term rentals.
I suspect that Uber is not looking to replace their existing drivers and their vehicles but to supplement them as they move in a new direction, perhaps aiming at corporate customers. So I do wonder what they are cooking up.

Steve A.

EDIT: I guess Uber could be hoping for *investors* to come up with the money to pay for a self-driving fleet to stay true to their cheap-ass business plan.

It looks like Uber does offer basic insurance for their "driver-partners" for whenever they are logged on to the app, with more comprehensive insurance from the time they accept an assignment to when it has been completed:

https://www.uber.com/drive/insurance/
 
Enzo 4/1/2018 4:53 AM
Investors? Sure, they call them stock holders.

I think Uber is doing whatever it can to get established. Like a food truck that parks in the lot of a gas station: he can make a weekly sum for himself there, but he doesn't turn into the next Subway chain by putting up food trucks on other peoples' parking lots. I think Uber will ditch the part-time, whenever-I-feel-like-it driver pool as quickly as they can, in favor of a steady profit maker.

Today is Easter. Automatic cars don't know the difference. But do we think the Uber driver pool is as large today as it would be tomorrow? Or on Xmas? Super Bowl Sunday? Obviously not all Uber drivers have the same interests, but last night I watched the NCAA basketball games, and if I were an Uber driver, I would not have been available.


AirBnB is fighting in these parts. Several municipalities have filed suit claiming it is just a way to avoid paying hotel taxes. I think AirBnB is a good example of scale. Someone got the idea of renting out their home. That spread, so someone came up with AirBnB, which is just a clearinghouse for people looking to rent and owners willing to do so. AirBnB doesn't make money from people staying, it makes money from renters for the booking. Their only investment is a computer system. Uber is the same...so far. I would imagine the real business model of AirBnB would be to grow it as fast as they can until someone like Priceline or Trivago buys it from them. I think Uber though is positioning itself to become a new large transportation company.

If you drove a cab, you could be Steve's Taxi Inc. No one knows you, but you'd get rides. Or you COULD pay a franchise fee to drive under the name Yellow Cab in your town. People see that as an established cab company. When I rent a car, it is usually from Enterprise, but I have gone to Rent-a-wreck before, just some local garage. Everyone knows the name Uber; the word has become generic. Uber is franchising the name, selling the booking system. I suspect that, like Subway, they will continue to have drivers as franchisees, and company owned operations as well. But what if they require franchisees to have Uber cars?
 
Steve A. 4/3/2018 3:34 AM
Quote Originally Posted by Enzo View Post
Investors? Sure, they call them stock holders.
Uber is a private not a public company... what I meant was a large influx of cash from new investors to finance the planned fleet of self-driving taxis.

In any case there was a good article in the Sunday paper which I will post, explaining how it is the Waymo division of Google that is investing heavily in self-driving taxis, and how Uber is trying to catch up with them since their current business model of driver/partners will be going down the tubes, at least here in the U.S., in 10 years, maybe?

Just my opinion which is always subject to change... I call'em as I see'em and that can change as new data comes out. (I was going to say "new facts" but "new data" does not suggest a veracity in the stories that will emerge over time.)

Steve A.
 
Enzo 4/3/2018 4:04 AM
What do you call investors? How can they gain from investing unless they own stock? Or some other promissory vehicle? There is no need for a company to be publicly traded for there to be stock in it. There is absolutely nothing new about people giving a company money in hopes of return later. Private bonds?
 
Steve A. 4/4/2018 2:08 AM
An interesting article of which I have attached a PDF file to supplement the brief excerpt quoted below...

[ATTACH]47978[/ATTACH]

Waymo starts to eclipse Uber in race to self-driving taxis
by Carolyn Said [San Francisco Chronicle 03/30/2018]


Uber barreled into autonomous driving out of fear that it could end up as the Myspace or Yahoo of ride-hailing, a company with early gargantuan success that stumbled as times changed.
• Waymo, the self-driving unit of Google parent Alphabet, has pursued its ambitions more cautiously, accumulating long years of research and testing before pursuing a plan to bring its technology to the public.
• Now, as Waymo scales up its self-driving taxi service, Uber’s fear could be coming to pass.
• As Uber continues to reel from a fatal self-driving accident in Arizona, Waymo has confidently pushed forward — landing a deal to build 20,000 self-driving luxury sport utility vehicles with Jaguar Land Rover on top of its plan for thousands of Chrysler hybrid minivans. Within two years, it plans to have thousands of fully autonomous taxis — with no backup drivers behind the wheel — on the roads, starting in Phoenix, where it is already giving test rides. The company predicts it will give 1 million robot-taxi rides a day by 2020.
• Waymo, the industry pioneer, logged millions of autonomous miles as it perfected self-driving technology. But over the years, engineers defected out of frustration that it was not finding commercial uses for the technology. Now, with former auto executive John Krafcik at the helm, Waymo appears ready to create a self-driving taxi service that could conceivably dominate that field, at least early on, the way Uber does now with human-driven cars...

https://www.sfchronicle.com/business...o-12794353.php
https://www.evernote.com/shard/s300/...4aafee13edd3f2
https://m.facebook.com/story.php?sto...&id=1358043705


Quote Originally Posted by Enzo View Post
What do you call investors? How can they gain from investing unless they own stock? Or some other promissory vehicle? There is no need for a company to be publicly traded for there to be stock in it. There is absolutely nothing new about people giving a company money in hopes of return later. Private bonds?
If I were to invest a large amount of money in a private company like Uber to finance their expansion into a large fleet of self-driving vehicles I would insist on a percentage of the company rather than a fixed number of shares, the value of which would be diluted as more shares are issued. Yes, private companies will give shares to key employees as a reward for their contributions, etc., or sell them to venture capitalists, et al.

The investors I had in mind with very deep pockets would be big corporations who wanted in on the action and had to protect the interests of their own investors/shareholders (building the fleet that Uber has in mind to compete with Waymo, etc., is going to take some really big investments.)

Quote Originally Posted by Enzo
Today is Easter. Automatic cars don't know the difference. But do we think the Uber driver pool is as large today as it would be tomorrow? Or on Xmas? SUperbowl day? Obviously not all Uber drivers have the same interests, but last night I watched the NCAA basketball games, and if I were an Uber driver, I would not have been available.
Uber (like Airbnb) started in San Francisco so I have read a lot of articles about their drivers. The dedicated drivers would look forward to holidays or Super Bowl Sunday because of the opportunity to make more money with less competition.

Some of the Uber driver/partners might live in Fresno but drive up to SF every week where they might sleep in their cars as they work for, say, 4 days before returning home with a lot more money than they could earn driving all week in the Fresno area.

Of course the more casual drivers looking to pick up some "pin money" might opt out on the big days you mentioned. But the really dedicated drivers might have a mortgage to pay which they do not want to default on so they will scramble to make enough money to pay their bills.

There are a lot of people out here who lost good paying jobs during The Great Recession (probably in your area, too) and can make more money working their butt off in the "sharing economy" today than they could working, say, at Walmart for $11/hour and hoping to get enough hours (30 a week on average) to get benefits...

https://www.healthcare-now.org/blog/...ime-employees/

Steve A.

P.S. One of the big grocery stores out here (Safeway) is union, but new employees (even ones who have worked here for several years) are scheduled for no more than 26 hours a week, which is exactly the minimum they need to work to receive benefits. That works out well for Safeway because it gives the workers a big incentive to show up as scheduled (or talk to their supervisor ahead of time to be able to make up the lost hours later.) Part time workers can be very flaky about coming to work as scheduled...

BTW as of a few years ago there were 3 or 4 different tiers of pay and benefits for the union workers based on when they were first hired. The old timers might make $26/hr working full time with great benefits while the newer hires might make $10/hr and be scheduled for no more than 26 hrs a week. (My source of information was the old timer who restocked the freezers in the middle of the night when I would usually do my shopping.)

 
Enzo 4/4/2018 3:42 AM
That begs an interesting ( to me) question. I am sure there are guys looking forward to the holiday "overtime", but how many of them are there as opposed to the number who want it as time off? I mean would we have 15% wanting the extra time but 60% wanting it off? Or would we have 40% wanting the extra work and only 10% seeing it as a holiday? Would the number of guys stinking in their car for four days in need of a shower, be enough to compensate for the number of guys wanting to watch the game on TV at home?

Way back when, I was the guy who volunteered to be on call on holidays/weekends. I took the scheduled time, I swapped out with guys who didn't want the time. I was always looking for more work. Unfortunately, there was one of me, and four of the guys who looked for every opportunity to get OFF work. I was the guy who had unused sick days, and had use it or lose it holiday time every year. Those other guys were the type that every time they got enough hours for a sick day, they took it.

Point being, every job has the eager beavers, and every job has its slackers, and they don't always cover one for the other.


if I were to invest a large amount of money in a private company like Uber ...I would insist on a percentage of the company rather than a fixed number of shares...
A distinction that doesn't change my outlook. Call it shares, call it percentage of ownership, to me same same. I give you money, you owe me equity.
 
Steve A. 4/4/2018 5:16 AM
Quote Originally Posted by Enzo View Post
That begs an interesting (to me) question. I am sure there are guys looking forward to the holiday "overtime", but how many of them are there as opposed to the number who want it as time off? I mean would we have 15% wanting the extra time but 60% wanting it off? Or would we have 40% wanting the extra work and only 10% seeing it as a holiday? Would the number of guys stinking in their car for four days in need of a shower, be enough to compensate for the number of guys wanting to watch the game on TV at home?

Way back when, I was the guy who volunteered to be on call on holidays/weekends. I took the scheduled time, I swapped out with guys who didn't want the time. I was always looking for more work. Unfortunately, there was one of me, and four of the guys who looked for every opportunity to get OFF work. I was the guy who had unused sick days, and had use it or lose it holiday time every year. Those other guys were the type that every time they got enough hours for a sick day, they took it.

Point being, every job has the eager beavers, and every job has its slackers, and they don't always cover one for the other.



A distinction that doesn't change my outlook. Call it shares, call it percentage of ownership, to me same same. I give you money, you own me equity.
Points taken. I guess what I am saying is that since their inception both Uber and Airbnb have been cheap bastards, making their money acting as the middle men (leeches?!?) between buyers and sellers, with the software they designed being their only contribution. One of their biggest expenses has been lawyers fighting the regulations imposed on them by SF and California.

The whole "sharing economy" can offer some advantages to some people but can really rip off others, by calling the workers "contractors" to get around the normal responsibilities employers have to their employees.

You are comparing the jobs in today's sharing economy to the jobs we both experienced in the past but I find them to be very different. The driver/partners quite often lost good jobs during the Great Recession and are scrambling to make ends meet because they could not find a job offering the same pay and benefits that they had before.

Then again you might find younger workers who still get an allowance from their wealthy parents (or trust fund) but would like some extra spending money, so they will work only when it is convenient for them, and they would certainly be found at home on Super Bowl Sunday.

But I would guess that most driver/partners would fall somewhere between those two extremes.

I just looked this up. It looks like the founders Garrett Camp and Travis Kalanick did very well with their initial investment of $200,000 in seed money. (I wonder how many subsequent investments they had to make in Uber besides reinvesting profits as necessary to keep the ball rolling.)

Financing:

The founders invested $200,000 in seed money upon conception in 2009. In 2010, Uber raised $1.25 million in additional funding. By the end of 2011, Uber had raised $44.5 million in funding. In 2013, Google Ventures invested $258 million in the company based on a $3.4 billion pre-money valuation. In December 2014, Chinese search engine Baidu made an investment in Uber of an undisclosed amount. The deal also involved connecting Uber with Baidu's mapping apps. In January 2015, Uber raised $1.6 billion in convertible debt. In May 2015, Uber revealed plans to raise between $1.5 billion and $2 billion in new funding, raising the value of the company to $50 billion or higher. In September 2015, Uber raised another $1.2 billion, led by another investment by Baidu.

In 2016, Toyota made an undisclosed investment in Uber and looked into leasing options, which could potentially aid Uber drivers financially, a move in response to the other partnerships between Toyota's and Uber's counterparts. In June 2016, with plans to expand in the Middle East, Uber received $3.5 billion from the Public Investment Fund of Saudi Arabia. In July that same year, Uber raised $1.15 billion in debt financing. In August, Uber agreed to sell its subsidiary company, Uber China, to China's leading taxi-hailing app Didi Chuxing. Didi also agreed to invest $1 billion into Uber Global.

In January 2018, the company raised $1.25 billion in cash from an investor group including SoftBank, Dragoneer Investment Group, Sequoia Capital. The financing valued the company at $68 billion.

In February 2018, Uber combined its operations in Russia, Armenia, Azerbaijan, Belarus, Georgia and Kazakhstan with those of Yandex.Taxi and invested $225 million in the venture.

In total, Uber has raised $22 billion from 18 rounds of venture capital and private equity investors.

https://en.m.wikipedia.org/wiki/Uber
From that same article is a chart showing Uber's profitability (or lack thereof!)

[img]http://music-electronics-forum.com/attachment.php?attachmentid=47979&amp;d=1522840323[/img]


Steve A.

[ATTACH=CONFIG]47979[/ATTACH]
 
Enzo 4/4/2018 7:20 AM
Yeah, well, Facebook is a middleman company and it didn't make a profit at first either.
 
bob p 4/4/2018 12:26 PM
I think the idea that Uber provides people with meaningful economic opportunity is fundamentally flawed. Driving a taxi cab isn't a good paying job, it never has been. It's always been one of the lowest paying jobs in the service industry. It's sad to think that driving someone else around in an Uber -- a minimally skilled job -- could be considered a decent job in America. That sort of thinking shows how much our idea of what comprises a good paying job has slid in the past 50 years.
 
bob p 4/4/2018 1:25 PM
SDC Promote Driver Distraction
Back to the most recent Tesla accident -- the March 23, 2018 crash of a Tesla Model X that killed Wei Huang on the 101 in Mountain View, California:

Tesla has confirmed that auto-pilot was in use at the time that their Model X crashed into the concrete lane divider. Tesla also leaked out information that Huang had not had his hands on the steering wheel during the final 6 seconds that preceded the fatal impact. In response to these leaks, Tesla has been taking some heat for jumping ahead of the NTSB and leaking information about the accident, especially when the leaks seem to have the motive of placing blame on the dead driver.

I think it's significant that Wei Huang only had his hands off of the wheel for 6 seconds prior to the fatal impact. It's not as if he was one of those people who over-rely on their SDC's auto-pilot for prolonged periods; he only had his hands off of the wheel for 6 seconds, and 6 seconds was long enough to kill him. This brings up what I think is a very interesting point:

Auto-Pilot and "On-Board Safety Systems" are dangerous because they foster driver inattention.

The manufacturers of SDC like to try to dodge accountability for the failures in their autonomous systems by claiming that the person who is in the driver's seat remains responsible for paying attention at all times, even though the manufacturers of SDC have the specific intent to profit by marketing a product whose appeal is to allow you to not have to pay attention while driving. Their premise seems absurd -- every bit as absurd as the Q-Tip Defense. You know what I mean by the Q-Tip defense, right? If you look at a box of Q-Tips, the instructions specifically tell you to never insert them into your ear, when everyone knows that's exactly why you bought them.

The problem with these "on-board safety systems" (as manufacturers like to market them) is that they foster driver inattention. They provide drivers with the illusion that the "safety system" allows you to stop paying attention, so that you can be distracted by diverting your attention elsewhere.

Car manufacturers have responded that they're going to change the way that these systems are implemented. Following the Josh Brown / Tesla S accident in Florida (the Tesla-semi accident), Tesla admitted that not only did their visual recognition system fail to notice the white semi-trailer against a brightly lit sky, their radar interpretation software intentionally ignored it, as that controller had been trained to "tune out" "large stationary objects" (like the semi trailer) to avoid the "nuisance braking" that would otherwise be caused by overhead road signs.

In order to avoid future liability associated with these sorts of defects related to the Brown death, Tesla's auto-pilot was designed to "require additional driver involvement" to prevent a crash. The Huang death shows us that Tesla's plan to require "additional driver involvement" was not enough. Huang remained involved in driving the Tesla, as he had his hands on the wheel (and presumably his eyes on the road) up until 6 seconds before impact. Clearly, "additional driver involvement" isn't going to be good enough.

Cadillac has announced that it will implement eye-tracking technology as part of its "Super Cruise" system, though they haven't explained how the system will respond when a driver is not paying attention. It's doubtful that even the Cadillac eyeball tracker could respond to prevent a death caused by a few seconds of inattention while travelling at highway speeds. At 70 mph you will cover 100 feet of highway in 6 seconds, and it will take a lot more roadway than that to stop.

As the Josh Brown/Tesla accident has shown us, a chiming alarm isn't good enough. Brown's Tesla issued 7 different alarms, requesting him to take the wheel, before it killed him. The Brown and Huang deaths show us that people will ignore alarms, even if it leads to their death. Clearly alarms aren't going to be good enough either.

Nissan has designed its "ProPilot" system to bring the vehicle to a full stop if the driver takes their hands off of the wheel for an extended time period. Great. Now we're going to have cars on the roadway making unannounced stops in moving traffic when their drivers stop paying attention. How can anyone possibly think that stopping a car that's moving in traffic is going to be a good plan?

Driver inattention is a real problem. A recent insurance survey asked people what kinds of things they're likely to do in a SDC while the system's autopilot is engaged.

45% said they'd make a phone call.
42% said they'd eat.
27% said they'd read.
21% said they'd watch TV.
19% said they'd sleep.
7% said they'd have sex.
5% said they'd drink alcohol.
33% said the biggest advantage of an SDC would be that they could use it to get home safely while impaired by alcohol or drugs.

I think the auto industry seriously needs to re-think how they market these "on-board safety features" to customers. Most people don't use them as on-board safety features; most people use them as a way to get away with distracted driving, and that places us all at risk. At this point it looks like they're decreasing safety on the roadways rather than increasing it.

http://www.thetruthaboutcars.com/201...omments-tesla/
http://www.thetruthaboutcars.com/201...-self-drivers/
 
bob p 4/4/2018 1:31 PM
Re: Uber and Herzberg:

Arizona, Suppliers Unite Against Uber SDC Program

Velodyne and Aptiv Defend Their Systems, Blame Uber for Herzberg Crash
 
Justin Thomas 4/4/2018 1:45 PM
Quote Originally Posted by bob p View Post
...
Driver inattention is a real problem. A recent insurance survey asked people what kinds of things they're likely to do in a SDC while the system's autopilot is engaged.

45% said they'd make a phone call.
42% said they'd eat.
27% said they'd read.
21% said they'd watch TV.
19% said they'd sleep.
7% said they'd have sex.
5% said they'd drink alcohol.
33% said the biggest advantage of an SDC would be that they could use it to get home safely while impaired by alcohol or drugs.

I think the auto industry seriously needs to re-think how they market these "on-board safety features" to customers. Most people don't use them as on-board safety features; most people use them as a way to get away with distracted driving, and that places us all at risk. ...
We are all going to die...

Justin
 
eschertron 4/4/2018 2:15 PM
Quote Originally Posted by bob p View Post
5% said they'd drink alcohol.
33% said the biggest advantage of an SDC would be that they could use it to get home safely while impaired by alcohol or drugs.
Or they could take a taxi driven by a blind person... Mythbusters blind-driving
 
Steve A. 4/4/2018 3:32 PM
Quote Originally Posted by Enzo View Post
yeah well, Facebook is a middleman company and it didn't make a profit at first either.
So was Webvan, which became the poster child for the dot-com bust in the early '00s (the Naughty Aughties?)

10 big dot.com flops - Webvan.com (2) - CNNMoney.com

That chart was a last minute addition as I was thinking of all of the investors who have paid $22 billion so far... they would certainly be up sh*t creek if Uber went belly up!

I am biased: after reading about Uber and Airbnb in the papers for so many years, and about their fights against local, state, federal and foreign regulations, I have absolutely no respect for the companies or their business plans.

FWIW I think that Zuckerberg had no idea Facebook would grow so big when he started it to facilitate on-line communication between college students. Perhaps I am being naive but I do not think he started FB with the intention or hope of it making him one of the richest people in the world.

Steve A.
 
Enzo 4/4/2018 3:47 PM
I think the idea that Uber provides people with meaningful economic opportunity is fundamentally flawed. Driving a taxi cab isn't a good-paying job; it never has been.
Neither is any low-paying job. No one said Uber driving is a career. It is a part-time wallet padder, that is all. Like when old people stand at the door of a Walmart greeting you. They trade some time for some dollars. I used to drive a cab in Washington DC. It wasn't glamorous or glorious, nor was the pay huge. But it was a decent summer job for a college kid. Like a waitress, tips were my real money. In fact, haven't driver tips been a point of contention in the whole Uber/Lyft/etc. deal? A professional going to the airport, or a govt type going from the Ag department over to Treasury -- good tip. Old lady going 3/4 mile to the grocery store -- zero tip.

Tip income is why people stay waitresses all their lives, while no one stays a busboy forever.


At 70 mph you will cover 100 feet of highway in 6 seconds, and it will take a lot more roadway than that to stop.
Show the math. If I recall anything from my physics classes, it was that 60 mph is 88 feet per second. So 100 feet at 70 mph should take more like only ONE second.

I think 70 mph is roughly 103 fps. So my figures get me about 600 feet for six seconds of road time at highway speed. Two football fields.

Stopping distance from 70 is approximately 388 feet.

Here:
https://nacto.org/docs/usdg/vehicle_...time_upenn.pdf
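
If anyone wants to check the arithmetic themselves, here's a quick back-of-the-envelope sketch in Python. There's nothing in it but the feet-per-mile conversion:

[CODE]
MPH_TO_FPS = 5280 / 3600              # 1 mph = ~1.467 ft/s

def feet_covered(speed_mph, seconds):
    """Feet of roadway covered at a constant speed."""
    return speed_mph * MPH_TO_FPS * seconds

print(round(70 * MPH_TO_FPS))         # ~103 ft/s at 70 mph
print(round(feet_covered(70, 1)))     # ~103 ft in one second
print(round(feet_covered(70, 6)))     # ~616 ft in six seconds -- about two football fields
[/CODE]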
 
Enzo 4/4/2018 3:58 PM
I agree Steve, Zuckerberg was as surprised as anyone. He just wanted to make it easy to get dates.


But I do think it is a current model to pick some small service, and grow it to the point one of the big boys buys it from you.
 
Steve A. 4/4/2018 4:19 PM
Quote Originally Posted by bob p View Post
I think the idea that Uber provides people with meaningful economic opportunity is fundamentally flawed.
I would say that applies to most jobs in the new "sharing economy" which is fine for some workers who are not depending on them to support a family — or a mortgage!

Driving a taxi cab isn't a good-paying job; it never has been. It's always been one of the lowest-paying jobs in the service industry.
Having worked in the service industry most of my life I must disagree — there were a lot of jobs much worse than that. Taxi cab drivers could dream of buying a coveted taxi medallion someday which they could sell for a good profit when they decided to retire... at least until Uber came along.

It's sad to think that driving someone else around in an Uber -- a minimally skilled job -- could be considered a decent job in America. That sort of thinking shows how far our idea of what counts as a good-paying job has slid in the past 50 years.
I think that the biggest slide has been in the past 10 years, with much of it going back 20 years due to outsourcing jobs overseas (and 30 years, too, in response to Reaganomics.)

So what comprises a good paying job? I think it depends on the individual... as long as I was making enough money to pay my bills with enough left over to piss off on my son and my hobbies I was happy. As for saving money in IRAs and 401(k)s I considered smoking to be my retirement plan until I quit in 2004 after having a heart attack...

(Due to a very fortunate series of events I became a homeowner in 1984, this after having J.C. Penneys turn me over to a collection agency in 1977 for an $80 pong video machine which was halfway paid off. Hey, I had been fired from a good job at Pacific Stereo/CBS Retail Stores for 'insubordination' and bought $20 traveler's checks with my $2,000 "severance package"... the vested portion of company stock.

My monthly bill was $10 so I would go by Penney's and give them a $20 traveler's check every other month. Heck, I was pre-paying my next monthly payment but they did not see it that way at all. When my unemployment benefits ran out I had to abruptly move back home with my parents and missed one planned payment and BOOM! — a letter from the collection agency.

Other than that my credit record was nonexistent but my housemate in 1984 had a very good family friend in the real estate business who pulled strings to get us a mortgage when our landlord decided to sell after we lived there a year.)


Steve A.
 
bob p 4/4/2018 7:27 PM
Quote Originally Posted by Enzo View Post
Show the math. If I recall anything from my physics classes it was 60mph is 88 feet per second. So 100 feet at 70mph should take more like only ONE second.
Right. ONE second. That was a mistype on my part. I had 6 seconds on my brain because Huang let go of the wheel for 6 seconds. And that would result in about a 600-foot distance covered.

Stopping distance from 70 is approximately 388 feet.
Correcting my math error changes everything. If an average car uses 388 feet to stop from 70 mph, that means the Tesla may have had plenty of room to stop if it had only recognized that it was about to drive into a barrier. It would likely have avoided the head-on collision, but then it just might have caused a rear-end collision if it had been programmed, like a Nissan, to come to a full stop on a busy interstate.
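
Just to show where a number like 388 feet can come from, here's a minimal sketch that splits stopping distance into perception-reaction distance plus braking distance. The 1.5-second reaction time and 0.7 g braking rate are assumptions I picked for illustration -- the NACTO/UPenn table may use different inputs, and the total swings a lot with those two values:

[CODE]
MPH_TO_FPS = 5280 / 3600               # 1 mph = ~1.467 ft/s
G_FPS2 = 32.174                        # gravity, ft/s^2

def stopping_distance_ft(speed_mph, reaction_s=1.5, braking_g=0.7):
    """Perception-reaction distance plus braking distance (v^2 / 2a).
    reaction_s and braking_g are assumed values, not measured figures."""
    v = speed_mph * MPH_TO_FPS         # speed in ft/s
    reaction = v * reaction_s          # ground covered before the brakes even engage
    braking = v**2 / (2 * braking_g * G_FPS2)
    return reaction + braking

print(round(stopping_distance_ft(70)))   # ~388 ft with these particular assumptions
[/CODE]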

What seems odd is that the car steered right into the median barrier, while the traffic split into two lanes, one that went to the left of the barrier and one that went to the right. The Tesla AutoPilot driving right into the barrier between the two lanes reminds me of the avionics problem of "controlled flight into terrain," where a jumbo jet's auto-pilot flies smack into the side of a mountain.

I'm thinking that refusing to get into the Johnny Cab won't be enough to save us. Now we have to worry about the SDC causing accidents that will involve us. Like Justin said, we're all gonna die.
 
Steve A. 4/5/2018 11:12 PM
Quote Originally Posted by Enzo View Post
I agree Steve, Zuckerberg was as surprised as anyone. He just wanted to make it easy to get dates.
Damn, I was going to say that but I decided I better not... ;-)

One really good thing about Uber is that it has been great for people who don't have a car or a lot of money for taxis... for them it is easier to use than calling for a regular taxi, much faster and usually a lot cheaper. The county I live in has very limited public transportation, so you either need a car or a friend or relative to drive you around. When you add up all of the costs of owning a car, for a lot of young people it is cheaper to use Uber to get around.

BTW there is a concierge service called GoGoGrandparent to allow people without smart phones and a data plan to book rides with Uber and Lyft. Once you set up an account you can call them and they will arrange everything for a nominal fee (18 cents a minute for the phone call.) They do recommend that the rider have a cell phone to make sure that the driver can find them.

https://gogograndparent.com/faq

I was going to sign up because I used to have a flip phone for my free Lifeline service, but it died and they only have smart phones nowadays. I just realized that I can now sign up for MoviePass... $9.95 a month allows you to watch one 2D movie a day at participating theaters, but you need a connected smart phone or tablet to reserve a ticket while you are within 100 yards of the theater.

https://www.moviepass.com/

And I can sign up for the online Entertainment Book ($9.99/first year). I had tried the 99-cent one-month trial thinking that I could print out the coupons at home, but most of the good deals were active for only 10 minutes, so you had to be close to the restaurant or venue.

https://shop.entertainment.com/pages/mobile-preview


Steve A.
 
Steve A. 4/5/2018 11:24 PM
Speaking of stopping distances, when us oldsters were growing up we were supposed to allow one car length of distance behind the car ahead of us for every 10 mph we were travelling... 30 mph, 3 car lengths.

Of course cars today have much better brakes so I might leave 2 car lengths behind the car ahead of me if I am driving 40 mph. Or at least TRY to leave 2 car lengths ahead of me because some asshole will always try to pull into the space. *sigh*
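
For what it's worth, here's a rough comparison of the old car-length rule against the newer two-second rule. The 16-foot average car length is my assumption, not anything official:

[CODE]
MPH_TO_FPS = 5280 / 3600
CAR_LENGTH_FT = 16                     # assumed average car length

def old_rule_ft(speed_mph):
    """One car length of gap for every 10 mph."""
    return (speed_mph / 10) * CAR_LENGTH_FT

def two_second_rule_ft(speed_mph):
    """Stay two seconds behind the car ahead, whatever the speed."""
    return 2 * speed_mph * MPH_TO_FPS

print(old_rule_ft(30), round(two_second_rule_ft(30)))   # 48 ft vs ~88 ft
print(old_rule_ft(40), round(two_second_rule_ft(40)))   # 64 ft vs ~117 ft
[/CODE]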

One of the rules that has gotten me through life is this: when driving behind a vehicle that has much better brakes than you, give them PLENTY of room...


Steve A.
 
Gnobuddy 4/6/2018 3:27 PM
Quote Originally Posted by bob p View Post
...Tesla admitted that not only did their visual recognition system fail to notice the white semi-trailer against a brightly lit sky, their radar interpretation software intentionally ignored it, as that controller had been trained to "tune out" "large stationary objects" (like the semi trailer) to avoid "nuisance braking" that would be caused by overhead road signs.
As I mentioned before, estimates of the computational power of present-day personal computers and microcontrollers suggest they are only capable of approximately one hundred thousandth of the computational ability of the human brain. This is about the same computational ability as an insect. We're not even up to reptile-level smarts yet.

The problem Tesla outlined in Bob's quote above is a classic case of the extreme (insect-like) stupidity of present-day computers, and therefore the self-driving cars that depend on them. The AI system is too stupid to tell a tractor-trailer from an overhead road sign.

Are you freaking kidding me?

Instead of recognizing that this also makes the AI too stupid to be in charge of people's lives, what does Tesla do? Tunes it to ignore all large stationary objects.

Good grief. Where's that facepalm smiley when you need it?

Note that the problem is entirely to be expected if we start out with the understanding that the AI is no smarter than a cockroach. We cannot expect a cockroach or other insect to be able to tell the difference between a tractor-trailer and an overhead road sign. No wonder a cockroach-stupid AI can't, either.

The tragedy is that this stupidity is now starting to cost human lives, which is exactly what I've been worried about for the past twelve-plus years, ever since I watched a documentary film about the unexpected success of the 2005 DARPA Grand Challenge. You could tell even back then that it would only be a matter of time before half-baked self-driving cars were unleashed on a well-brainwashed human society.

It only takes 25,000 rat brain cells in a petri dish to fly a (simulated) aircraft ( Extracts - "Brain" In A Dish Acts As Autopilot Living Computer ). But the 25,000 cells have no understanding at all of what they're doing: they don't know up, down, front, back, fly, crash, live, die, or any of the other concepts a human pilot would. All the cells "know" is to regulate the voltage difference between the electrodes in the petri dish, which results in the simulated F-22 flying straight and level. Fortunately, there are no tractor-trailers or medians or jaywalking pedestrians in the virtual sky.

The same is true for our insect-brained self-driving cars. The car's "brain" is too stupid to know any of those concepts, either. Not even fast, slow, stop, tractor-trailer, or "overhead road sign".

Computers are getting more powerful, but more slowly of late, as Moore's Law runs out of steam. It will be a long time before the factor of 100,000 is made up, and we have affordable computers with the computational power of a mammal brain, rather than an insect brain. Until that day arrives, I can't help but think that we have little reason to trust in the capabilities of self-driving cars.
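
Rough arithmetic on how long that factor of 100,000 would take to close, if the old doubling cadence were even still holding (the gap figure comes from my estimate above, and the doubling intervals are assumptions for illustration):

[CODE]
import math

GAP = 100_000                          # assumed shortfall vs. a human brain (from the estimate above)
doublings = math.log2(GAP)             # doublings needed to close it

print(round(doublings, 1))             # ~16.6 doublings
print(round(doublings * 2))            # ~33 years at the classic 2-year doubling
print(round(doublings * 3))            # ~50 years if doubling slows to every 3 years
[/CODE]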

-Gnobuddy
 
bob p 4/6/2018 3:49 PM
Moore's Law isn't a law any more. The problem is that to make computers faster, you have to increase clock speed; to increase clock speed you have to increase voltage; and when you increase voltage, the energy that needs to be dissipated as heat goes up with the square of the voltage. Obviously, computers can only be made to go so fast before it becomes a thermal design problem. That's why the CPU performance wars are over, and that's why the industry has worked on parallel processing and multi-core technology instead of running in the clock speed race. It was at least 10 years ago that companies like Nvidia gave up on the speed race and poured all of their R&D into low-voltage applications instead. Why low voltage? Heat.
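
The voltage-squared point is easy to see with the usual first-order formula for CMOS switching power, P ≈ C·V²·f. The capacitance, voltage and clock numbers below are made-up round figures, just to show the scaling:

[CODE]
def dynamic_power_watts(cap_farads, volts, freq_hz):
    """First-order CMOS dynamic power: P ~ C * V^2 * f."""
    return cap_farads * volts**2 * freq_hz

base = dynamic_power_watts(1e-9, 1.0, 3.0e9)   # made-up chip: 1 nF switched, 1.0 V, 3 GHz
fast = dynamic_power_watts(1e-9, 1.3, 4.0e9)   # bump the voltage to 1.3 V to hit 4 GHz

print(round(base, 2), round(fast, 2))          # ~3.0 W vs ~6.76 W
print(round(fast / base, 2))                   # ~2.25x the heat for ~1.33x the clock
[/CODE]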
 
bob p 4/6/2018 3:53 PM
Quote Originally Posted by Gnobuddy View Post
As I mentioned before, estimates of the computational power of present-day personal computers and microcontrollers suggest they are only capable of approximately one hundred thousandth of the computational ability of the human brain. This is about the same computational ability as an insect. We're not even up to reptile-level smarts yet.
With that in mind, you have to wonder why they haven't made the obvious choice:


[IMG]http://music-electronics-forum.com/attachment.php?attachmentid=48358&d=1523051552[/IMG]

Instead of training computers to drive cars, maybe we should be training dogs.
 
bob p 4/11/2018 6:37 PM
Huang / Model X:

Tesla Blames Driver in Fatal Crash as Victim's Family Lawyers Up

At least Uber was smart enough to settle. Elon the Arrogant is going to make them go to court.

Quote Originally Posted by Tesla
The March 23 death of Walter Huang happened on a clear day, with several hundred feet of visibility ahead, the electric-car maker said in an emailed statement. Tesla had already said Huang, 38, didn’t have his hands on the steering wheel for six seconds before his vehicle collided with a highway barrier in Mountain View, California, and caught fire.

“The only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so,” the statement said. “The fundamental premise of both moral and legal liability is a broken promise, and there was none here.”
 
bob p 4/12/2018 4:18 PM
Tesla in Open Feud with US Safety Board over Crash Probe

Quote Originally Posted by Bloomberg
Tesla Inc.’s tense relationship with the U.S. National Transportation Safety Board boiled over Thursday with both sides accusing the other of making improper disclosures regarding a fatal accident under investigation.
...
The NTSB took the unusual step of stripping the carmaker of its role in an investigation of a fatal crash involving one of its vehicles, saying the electric-car maker failed to abide by an agreement not to disclose information while the probe was underway.
 
Steve A. 4/13/2018 12:30 PM
Here is one theory as to why self-driving vehicles are being forced on us... personal computer sales are way down along with PC software — everything is mobile this and mobile that. So everybody employed in the personal computer industry needed a new project to work on. Robots never caught on so someone came up with the idea of self-driving vehicles. "Great! That will keep us working forever because there are so many bugs that will never be solved!"

As mentioned before the one viable application for self driving vehicles would be 18 wheeler convoys of four trucks following the lead truck piloted by a human. As long as some asshole doesn't try to cut in it should work.

As for cars the applications would be much more limited like funeral processions for all of the victims of self driving cars...


[img]http://music-electronics-forum.com/attachment.php?attachmentid=48452&d=1523643968[/img]


Steve A.

 
bob p 4/13/2018 3:50 PM
Steve, that's not a theory, it's a fact. Nvidia CEO Jensen Huang made that comment a while ago. They can't sell enough GPUs to gamers any more, so they're focusing on bitcoin mining, supercomputing and machine learning. Nvidia has its own SDC program and they're planning on making huge money by putting their processors into every car in the future.
 
nickb 4/13/2018 3:56 PM
Quote Originally Posted by bob p View Post
Steve, that's not a theory, it's a fact. Nvidia CEO Jensen Huang made that comment a while ago. They can't sell enough GPUs to gamers any more, so they're focusing on bitcoin mining, supercomputing and machine learning. Nvidia has its own SDC program and they're planning on making huge money by putting their processors into every car in the future.
A bit off topic, but that makes me wonder if Nvidia might be, or at least be connected to, the secret inventors of blockchain technology?
 
nosaj 4/13/2018 3:59 PM
Quote Originally Posted by nickb View Post
A bit off topic, but that makes me wonder if Nvidia might be, or at least be connected to, the secret inventors of blockchain technology?
Nintendo got its start making playing cards for the Yakuza in the 1800s.

Entirely possible.

nosaj
 
Enzo 4/13/2018 5:13 PM
Like the VHS people are worried about tape sales. Microsoft doesn't care how many desktops they sell to old people; what they want is eyes on Microsoft content. On your laptop, tablet, phone, or yes, desktop. Apple doesn't care if they sell another Mac for the desktop; they want lifelong buyers of Apple apps and other products.

TVs are now digital; it took a whole new set of hardware to receive them. Did that change cable? No. More stations, organized differently, still 500 stations for your hundred bucks or whatever. Wanna watch on your TV set or on your phone? We don't care.

Silicon Valley and similar gave up working on desktops ages ago. Self-driving cars came along way too late to bump off desktops. Look at regular cars: they all now have touch screens, basically iPads on the dash. They are made to look like your phone.
 
g1 4/13/2018 5:38 PM
Quote Originally Posted by Steve A. View Post
Here is one theory as to why self-driving vehicles are being forced on us... personal computer sales are way down along with PC software — everything is mobile this and mobile that.
Here's another one, advertising:
https://www.vox.com/energy-and-envir...ising-business
 
The Dude 4/13/2018 5:52 PM
Quote Originally Posted by Steve A. View Post
......one viable application for self driving vehicles would be 18 wheeler convoys of four trucks following the lead truck piloted by a human......
Oh, you mean like a TRAIN?
 
g1 4/13/2018 6:05 PM
Quote Originally Posted by The Dude View Post
Oh, you mean like a TRAIN?
No, more like these guys, who are already out there working in the real world, but we'll just pretend they don't exist.

https://www.wired.com/story/embark-s...ck-deliveries/
 
bob p 4/13/2018 9:30 PM
Quote Originally Posted by g1 View Post
No, more like these guys, who are already out there working in the real world, but we'll just pretend they don't exist.

https://www.wired.com/story/embark-s...ck-deliveries/
Fact Check: The problem is not that the world is pretending that those Embark-brand "autonomous trucks" don't exist; the problem is that you are pretending that they are something they are not.

According to SAE J3016, autonomous cars are ranked in six levels of autonomy (from Zero to Five).
SAE Standards 26262, Part II.

According to Embark, their trucks use Level 2 (Partial Automation), just like Uber and Tesla. Embark admits that their vehicles operate under Level 2 and hopes that someday they will operate under Level 4. Full autonomy does not occur until Level 5.

Here are some facts that disprove the self-driving freight truck malarkey:

Level 2 Standards (in use by Embark) are defined as Partial Automation.
Partial Automation only allows Steering, Acceleration and Deceleration to be controlled by the computer. All monitoring of the driving environment is the responsibility of a human driver. All fallback performance of a dynamic driving task is the responsibility of a human driver. At level 2, the trucks speed up and slow down, and stay between the white lines. That's all. Level 2 has ZERO INVOLVEMENT in monitoring the driving environment.

Level 3 Standards (not in use by Embark) provide Conditional Automation in which the system monitors the driving environment, but takes no steps to intervene when a problem is detected. When any problem is detected, a human driver is expected to perform ALL intervention. Embark trucks aren't this sophisticated. They're still at Level 2.

Level 4 Standards (not in use by anyone) provide High Automation, relying upon the system to monitor the driving environment and to intervene when a problem is detected.

Level 5 Standards, for Full Autonomy, aren't even one of Embark's goals.
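
If it helps to see the whole ladder in one place, here's the J3016 split sketched out in code. The wording is my paraphrase of the standard's summary chart, not the official text:

[CODE]
# Who does what at each SAE J3016 level (paraphrased summary, not official wording).
SAE_LEVELS = {
    0: ("No Automation",          "human drives",   "human monitors",  "human is fallback"),
    1: ("Driver Assistance",      "shared control", "human monitors",  "human is fallback"),
    2: ("Partial Automation",     "system drives",  "human monitors",  "human is fallback"),
    3: ("Conditional Automation", "system drives",  "system monitors", "human is fallback"),
    4: ("High Automation",        "system drives",  "system monitors", "system is fallback, limited domains"),
    5: ("Full Automation",        "system drives",  "system monitors", "system is fallback, everywhere"),
}

for level, (name, control, monitoring, fallback) in SAE_LEVELS.items():
    print(f"Level {level} ({name}): {control}; {monitoring}; {fallback}")
[/CODE]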

[IMG]http://cyberlaw.stanford.edu/files/blogimages/LevelsofDrivingAutomation.png[/IMG]
 
Justin Thomas 4/14/2018 6:04 AM
Quote Originally Posted by Gnobuddy View Post
It will be a long time before the factor of 100,000 is made up, and we have affordable computers with the computational power of a mammal brain, rather than an insect brain...
-Gnobuddy
By which time the computers will also be smart enough to reply, when asked to serve in a self-driving car, "screw you, you'll sue me out of existence and sentence me to death if I so much as hit a pothole and spill your coffee on your shirt!"

Justin
 
Steve A. 4/14/2018 6:43 AM
Quote Originally Posted by g1 View Post
No, more like these guys, who are already out there working in the real world, but we'll just pretend they don't exist.

https://www.wired.com/story/embark-s...ck-deliveries/
I have shared the idea of having robo-convoys of, say, five 18-wheelers led by a human-driven truck on the freeways. There are so many crazy drivers on the freeway cutting in and out of lanes that, at least for the near future, you need a human driver in the lead vehicle. Besides, the robo-trucks would not have to be that smart, with just one basic command to process: "I must follow the truck ahead of me." All of the vehicles in the convoy would be communicating with each other remotely, so there would be no need for really complicated sensors to handle every possible unexpected event.
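
A toy version of that follow-the-leader logic might look like this -- purely an illustration of the idea, not how Embark or anyone else actually does it. The 80-foot target gap and the gain are numbers I made up:

[CODE]
def follower_speed_mph(lead_speed_mph, gap_ft, target_gap_ft=80.0, gain=0.2):
    """Match the leader's speed, nudged by how far the follower is off the target gap."""
    gap_error = gap_ft - target_gap_ft        # positive means we've fallen too far back
    return max(0.0, lead_speed_mph + gain * gap_error)

print(follower_speed_mph(60, 100))   # 64.0 -- speed up a little to close the gap
print(follower_speed_mph(60, 80))    # 60.0 -- holding the target gap, match the leader
print(follower_speed_mph(60, 60))    # 56.0 -- too close, back off
[/CODE]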

I would suggest that such convoys have special purple lights to indicate to human drivers to not try to butt in, along with very clear signs indicating what they are.

I would venture to guess that they could have robo-convoys on the highways within 5 years pending government approval with no added dangers for human drivers. One problem to iron out is how to proceed once off the freeway. I suggested special parking lots near freeway on ramps and off ramps. You would want locking gates so that only robo-vehicles and their handlers could enter, both for security and to make sure that no one was using them as park'n'ride lots.

Steve A.

P.S. I uploaded a PDF file of the article you posted. BTW I've been using these Samsung Tab 4 7.0 tablets for 3+ years and only recently figured out that they can create PDF files of practically any web page in Chrome - I am using the original version of Chrome (40.0.2214.109) that came with the tablet. I believe that later versions dropped that capability.

I mention that because I have tried all sorts of apps and plugins to avoid paying Adobe $10 or $12 a month to do that. Greedy bastards!

P.P.S. Within the PDF file there is a note from the Wired site that I only have 3 free articles remaining this month. One way to get around those darned limited access paywalls is to open the link in an incognito tab or browser. Once you close the tab or window your monthly limit is replenished. Shhh... keep it a secret so that they won't catch on!

.[ATTACH]48458[/ATTACH]

EDIT: Here is a link to the article about the self-driving convoys led by a human along with a PDF file of it in case you do not want to waste your 4 free articles this month. (Another way to get around limited access paywalls is to use a VPN.)

https://www.wired.com/2016/07/armys-...repare-battle/


.[ATTACH]48459[/ATTACH]
 
The Dude 5/7/2018 5:20 PM
https://www.cnet.com/roadshow/news/u...-car-accident/
 
bob p 5/8/2018 2:48 AM
Thanks for the link.

Uber had a problem with their obstacle detection software triggering too often, generating false positive warnings about obstacles that "unnecessarily" stopped their cars.

Tesla had a problem where a similar system would interpret signs hanging over the highway as solid obstacles that "unnecessarily" stopped their cars.

Both companies found the expedient solution was to turn off or tune down the safety feature because it was a nuisance.

In each case, Uber and Tesla had software that was sensitive enough to recognize "obstacles." The software worked. It triggered the safety protocols whenever it recognized something in the way and stopped the cars like it was supposed to. But the engineers didn't like that, because they wanted their cars cruising down the road, not stopping all the time. Because of all of the "false positive" warnings, they turned off or tuned down the safety system until people got killed because the system could no longer recognize a "true positive" event. Tesla killed their passenger by driving into the side of a semi that Tessie thought was a road sign. Uber ran over Elaine Herzberg as she crossed the street.

In both cases, the companies decided to turn off safety features because they triggered "nuisance warnings." People died as a result.
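
The tradeoff they were tuning is easy to caricature in a few lines. The scores below are invented, and a real perception stack obviously doesn't boil down to one scalar, but it shows why cranking up the threshold to kill "nuisance" warnings also throws away real ones:

[CODE]
# Invented detection scores: (confidence, is_a_real_obstacle)
detections = [(0.95, True), (0.80, False), (0.75, True), (0.60, False), (0.55, True)]

def outcomes(threshold):
    """Count nuisance brakes (false alarms acted on) and missed real obstacles."""
    nuisance = sum(1 for score, real in detections if score >= threshold and not real)
    missed   = sum(1 for score, real in detections if score < threshold and real)
    return nuisance, missed

print(outcomes(0.50))   # (2, 0) -- some unnecessary braking, but nothing real gets missed
print(outcomes(0.90))   # (0, 2) -- no nuisance braking, and two real obstacles sail right through
[/CODE]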

I wish there were no settlements in these cases. Turning off the safety systems was willful, wanton negligence. I'd like to see these negligence cases go to court. It could bankrupt the companies.