On the last day of RSA Conference 2018 in SF, the exhibits closed at 3pm and I was hours early for my next meeting. Approaching the left turn from 5th Street south onto I-80 east, I observed the road surface markings in my lane and didn’t make the sharp turn, but instead turned 90° left onto Bryant rather than onto I-80. In the rearview mirror I saw all the cars behind me apparently ignoring the law and making the sharp turn. I was in no rush, had the window down to feel the balmy air and hear the sounds of the city, relaxed, and felt quite content with my moral standing for obeying traffic law.
Heading east on Bryant, I followed the new directions spoken by Apple Maps, to which I had given my destination before starting to drive.
I noticed a missing overhead traffic sign. I wondered, “was that an important sign?” but couldn’t figure out what the missing sign would have said. I figured it probably said something redundant, like “to I-80 east.” I went along with the stop-and-go traffic, following the many other cars, equally following the map app’s spoken instructions and, when I did look, the map app’s visual display.
It was a pleasant spring afternoon, women were walking in skirts, and I made a point of looking ahead of me, not at the sidewalk, as I didn’t want to be a guy in a truck looking at younger women in skirts walking home from work. The sidewalk was three lanes away: two traffic lanes plus one parking lane.
Crossing 2nd Street, I paid close attention to a uniformed person standing between lanes and directing traffic. His head was about a foot from my rearview mirror. His body and his waving arms didn’t leave much room between my lane of traffic and the lane to my right. Also demanding attention, my lane shifted to the left by about half a car width while crossing the intersection. That public servant’s safety and the traffic lane layout took my full attention.
I faintly remember a sign saying a carpool lane was ahead. I expected to see a sign saying where that carpool lane actually is or starts, but I never saw one saying which lane, or where it starts. On the freeways I know, there is usually enough distance and time to move into a compliant lane if you don’t qualify for carpool use.
About 150 meters later, around the corner on the Sterling Street on-ramp, I noticed a lot of law enforcement vehicles and officers. I was pulled over for a carpool violation: $481. The officer pointed at a sign that had been spray-painted over and said, “there it is.” He told me there had been enough signs leading up to it.
Via Google Street View I found out that the missing overhead traffic sign had declared both lanes carpool lanes.
Three points are remarkable:
• The map app sent me there, even though both lanes are carpool lanes. I checked the next day; the directions were still the same. Those directions would be fine at 10pm, but they are not at 5pm. They are fine for cars with multiple occupants, but not for a single occupant.
• The overhead carpool-lane traffic sign was down that day, but tickets were still issued.
• I did not notice the markings on the pavement because traffic was so thick, bumper to bumper, and I felt safe following the talking map app, the crowd of other vehicles, and the arm waving of the uniformed person directing traffic.
I called Caltrans but was not able to get a list of work orders for that week. I was looking for evidence of the sign being down, for use in court. The person on the phone was nice enough to point out that Google Street View had a photo of another nearby overhead traffic sign down at another time. Apparently these signs had been down on occasion, for some kind of service.
Maybe my call prompted Caltrans to talk with the law enforcement agency: my ticket was thrown out as invalid, without further explanation given to me. A positive interpretation: people still understand context, and are compassionate.
All’s well that ends well. Yet this is a cautionary tale about trusting machines, not just about AI, but also about the content of databases.
That spring was bad for us with Apple’s automated services more than once. A neighbor had left a voicemail, and the system sent a wrongly recognized transcription, so a kid thought it was a call to a wrong number and ignored it, because the transcription started with “Hi Liam…” and no one in the family goes by that name. Apparently our dog had been bitten by a snake or stung by bees. We still got the dog back, but only a day later, from Animal Services.
For more than a day we also saw a wrong location for one kid in Apple’s people-locating app, Find Friends. The kid was home, his phone was here, but the app kept showing him in another town, two dozen miles away. Not as far off as other times, when it was hundreds of miles off. By now this is a noticeable pattern of “wrong location information from a service.”
This is also a tale about breaking the law at a moment when I was not expecting to break the law, had no intention of taking advantage of anyone or anything, and even thought I was being especially polite and careful.
Had I not avoided looking in the direction of the women on the sidewalk, and had I paid less attention to the man standing between lanes directing traffic, I might have noticed one sign under a tree that says where the carpool lane starts.
Even at that sign, at that point, it could still have been challenging to recognize quickly enough that both lanes are carpool, that there is no single-occupant lane on the Sterling Street on-ramp, and to get out of that flow. I am not there often enough to speak with certainty.
Also, dense traffic makes it a bit harder to see road surface markings.
Maybe there is another lesson to be learned too: Don’t drive in the city.