Making Sense Of Apple’s Self-Driving Car That Recently Struck A Curb In California And Owned Up To It

Have you ever been driving and perchance struck a curb?

I’m sure that you’ve experienced that gut-wrenching sorrowful moment.

So did an Apple self-driving car that was recently out and about in Silicon Valley during an autonomous driving journey.

To clarify, the AI autonomous driving system was presumably neither gut-wrenched nor sorrowful, which would be an over-the-top anthropomorphic overlay (i.e., attributing human emotions to the AI), but it did indeed strike a curb and it did indeed lead to some minor vehicular damage (note that the damage was just to the self-driving car itself and reportedly not to any other vehicles or obstacles).

The Apple self-driving car curb strike occurred on a Monday.

I suppose we all know that Mondays seem to be the one day of the week upon which adverse driving incidents seem to customarily rear their ugly head. For human drivers, it is usually due to having had a ruckus of a weekend and being a bit mind-dulled upon heading back to work for the week ahead. That doesn’t seem to befit the Apple self-driving car instance and we can perhaps put to the side any Monday blues considerations.

Before getting further into the Apple self-driving car run-in with the curb, let’s take a moment to reflect upon the classic matter of curb striking. In some countries this is referred to as “kerb” striking, but I prefer the more conventional American spelling of the word curb per se.

Envision yourself at the wheel of your car.

Imagine that you are coming up to make a tight right turn and you miscalculate the positioning of your car. As you begin to round the turn, all of a sudden the rightmost tires rub up against the curb. The car shudders as it skims across the edge of the curb. You probably feel the steering wheel jerk slightly as the vehicle gets pinned between a rock and a hard place.

The impact with the curb will be stridently felt depending upon such factors as the speed of the car, the angle of the striking posture, the stiffness and resiliency of the curb, the height of the curb, and so on. These all come to play regarding the physical forces that will jolt you and your vehicle.

After completing the turn, you know in the recesses of your mind that you might have caused some untoward damage to your car. The vehicle is still moving along and you don’t hear any dreadful sounds coming from the side of the tires. That means that you probably didn’t shred the tires or completely knock out the side panels or otherwise cause grievous harm to your precious vehicle.

Nonetheless, the odds are that you’ve done some amount of damage to the car. There could be some lacerations to the sidewall of the tires that came in harsh contact with the curb. The tire valve stem might have been hammered and now is slowly leaking air.

Assuming that the tires survived without much harm, the likeliest next concern is that your steering facility and suspension components are demonstrably askew.

Think of misaligned tie rods, struts, and the like. When you have such issues, including wheel misalignments, you might not immediately realize the dangers that lurk within. You could end up further down the road and surprisingly find that you cannot properly steer the car; upon taking a subsequent turn, the vehicle is nearly uncontrollable or at least not responding well to your steering efforts, which could become deviously and insidiously problematic.

You would be wise to have your car examined by a versed mechanic to identify any vehicular internal damages. Make sure to get the car repaired as needed.

Besides the potential harm to your car, there is something else that you might need to consider but that I doubt you would normally have put much thought toward.

Here in California, the DMV (Department of Motor Vehicles) has stipulated, via its published Vehicle Code (VC), a provision known as VC 22100 that imposes a penalty for making improper turns. Yes, that’s right, you can get into trouble for striking a curb since doing so is construed as an improper turn.

You see, the Vehicle Code says that drivers need to make a right turn as close as possible to the right-hand curb or the right edge of the roadway. That being said, there is a difference between being close to the curb and actively striking the curb.

A driver who manages to do a tight right turn and perchance sweeps into the curb is making an improper turn. You can get a ticket for this infraction. Even though you didn’t hit anyone or otherwise cause any outward damages, this can still get you a point against your driving record, and you risk ultimately accruing enough points to come under a negligent operator license suspension. There is also a modest financial penalty, typically around $238 or so, for making an improper turn.

I mention the legal mumbo jumbo about hitting a curb to proffer a bit of an explanation for why Apple was somewhat compelled to report on the self-driving car that struck a curb. I bring this up because some pundits have been surprised that Apple owned up to the incident. Those pundits figured that if the Apple self-driving car did not hit anything other than the curb, this was a get-out-of-jail-free situation and Apple could have just pretended that it never happened.

Thus, those (with all due respect) ill-informed or ill-outspoken pundits who speculate on such matters were dumbfounded that Apple officially reported the matter. Just don’t say anything and nobody will know, that’s what some have been suggesting.

Sweep the matter under the rug.

That would have been pretty stupid and would inevitably have undermined the totality of the Apple self-driving car efforts.

Here’s why.

You need to have some context about the self-driving realm.

The California DMV Autonomous Vehicles branch established an Autonomous Vehicle Tester (AVT) Program in 2014. This allows for the testing of autonomous vehicles on active public roadways and doing so with or without having a human driver in the driver’s seat. I’ve previously and extensively covered these matters, see my column analyses at the link here.

As background, there are essentially two major ways to test a self-driving car on public roadways.

One is via having a so-called safety driver at the wheel of the self-driving car during the entire time that the autonomous vehicle is driving around. The backup human driver is supposed to closely monitor the driving effort and be ready to instantly take over the wheel as needed (such takeovers are typically referred to as “disengagements” in the parlance of this industry). For my detailed explanation about the appropriate training and use of backup human drivers, see the link here.

Those who want to test their autonomous vehicles on the public roads in California usually file for and are granted a permit to do so, while also ensuring that a human driver is at the wheel as a means of taking over from the AI driving system. There are 53 such AVT “with driver” permits currently issued.

Meanwhile, once an automaker or self-driving tech firm believes their self-driving car is ready to advance into the next stage, they apply for and potentially receive a permit to do AVT on the public roadways without the use of a human backup driver at the wheel. There are currently only 8 such “without driver” or driverless permit holders here in California.

A stated requirement for all of those autonomous vehicles is this important proviso: “Manufacturers need to report any collision that results in property damage, bodily injury, or death within 10 days of the incident.”

If an automaker or self-driving tech firm that is operating under a permit fails to comply with this provision, they presumably would have their permit revoked. This would indicate that they can no longer legally proceed to have their self-driving cars on public roadways. They would have to reapply. But it seems doubtful that a reapplication would receive ready approval if the automaker or self-driving tech firm had failed to earlier and properly report an incident that required reporting.

You would likely have quite an uphill battle to try and get reinstated.

In short, it would be like shooting yourself in the foot. By failing to report an incident, you risk not just getting knocked out of the game for now; you might find it arduous if not impossible to get back into the game later on. For anyone seriously trying to advance their self-driving car project, being unable to actively practice having your autonomous vehicles on the public roadways would ostensibly put your project out of sorts and potentially at a kind of dead-end (well, at least to the extent of tryouts in that particular state).

Though striking a curb might seem like an innocuous matter, a decision to not report such an incident when there was in fact property damage done (in this case, damage to the autonomous vehicle) would have larger repercussions far beyond the mere incident itself. The prudent thing to do would be to go ahead and file an official report.

Not wanting to beat a dead horse, but if the pundits’ advice was followed, namely not reporting the rather bland or inoffensive circumstance, here’s what could arise. First, someone would likely have seen the incident and possibly reported it. Or there is a chance that an internal whistleblower might bring it to light (see my coverage about a predicted rise of whistleblowers in the self-driving car niche, at this link here).

Once word leaked out about the matter, you can bet that the whole thing would be blown entirely out of proportion. Any company caught in such a perilous mode would be accused of hiding a material fact about their self-driving car operations. A scandal would certainly brew. Finger-pointing would take place. Who knew and when did they know? The story would blossom and become as big as if something truly outrageously dastardly had occurred.

Not a pretty picture.

The astute thing to do is report the incident and move on. Few would likely notice it. Those who did notice would see that the situation was no more than a curb strike. And, on top of that, the incident was dutifully reported and the rules of the AVT Program were abided by.

End of story.

Well, that’s the end of the story portion about reporting the incident.

We can though perhaps glean some fascinating aspects from the curb strike. Before doing so, I’d like to set some foundation for what I mean when referring to self-driving cars.

Understanding The Levels Of Self-Driving Cars

As a clarification, true self-driving cars are ones in which the AI drives the car entirely on its own and there isn’t any human assistance during the driving task.

These driverless vehicles are considered Level 4 and Level 5 (see my explanation at this link here), while a car that requires a human driver to co-share the driving effort is usually considered Level 2 or Level 3. The cars that co-share the driving task are described as being semi-autonomous, and typically contain a variety of automated add-ons that are referred to as ADAS (Advanced Driver-Assistance Systems).

There is not yet a true self-driving car at Level 5; we don’t yet even know whether this will be possible to achieve, nor how long it will take to get there.

Meanwhile, the Level 4 efforts are gradually trying to get some traction by undergoing very narrow and selective public roadway trials, though there is controversy over whether this testing should be allowed per se (we are all life-or-death guinea pigs in an experiment taking place on our highways and byways, some contend, see my coverage at this link here).

Since semi-autonomous cars require a human driver, the adoption of those types of cars won’t be markedly different than driving conventional vehicles, so there’s not much new per se to cover about them on this topic (though, as you’ll see in a moment, the points next made are generally applicable).

For semi-autonomous cars, it is important that the public be forewarned about a disturbing aspect that’s been arising lately, namely that despite those human drivers who keep posting videos of themselves falling asleep at the wheel of a Level 2 or Level 3 car, we all need to avoid being misled into believing that a driver can take their attention away from the driving task while driving a semi-autonomous car.

You are the responsible party for the driving actions of the vehicle, regardless of how much automation might be tossed into a Level 2 or Level 3.

Self-Driving Cars And A Curb Strike

What can we infer from the aspect that an Apple self-driving car managed to recently strike a curb?

In one sense, it is quite a telltale clue, perhaps surprisingly so. We shall take a moment to explore the nature of the reported matter and then seek to unpack it.

According to the officially self-reported indication, the incident occurred on Monday, September 27, 2021, at 10:53 a.m. The Apple self-driving car being used was a sensor-laden 2017 Lexus RX 450h. If you’ve not seen these vehicles, they are chock-full of sensors such as video cameras, radar devices, LIDAR devices, and the like.

I see them quite often, roaming around the Silicon Valley area, along with the numerous other self-driving cars by other competing makers. You might think it would be shocking to see a self-driving car, but after a while, their presence almost seems mundane. I’ve reported previously that local human drivers have gotten so accustomed to self-driving cars that some drivers aim to bully or play tricks on the self-driving cars, see the link here.

Don’t do that. You are playing with fire and could cause quite untoward results. Enough said.

Let’s continue our exploration.

The Apple self-driving car was in the city of Sunnyvale and relatively nearby to various Apple buildings and an Apple campus site. There was some mild atmospheric cloudiness, but otherwise it was daylight, the roadway was dry, and no other unusual roadway conditions existed (such as if there had been loose gravel, or debris, or potholes, etc.).

It is worthwhile to keenly note the driving context (which, in the parlance of the self-driving industry, can be revealing and akin to the Operational Design Domain or ODD aspects).

For example, suppose the roadway was slick from rain. That could assuredly affect how the self-driving car would perform. Likewise, suppose it was nighttime and the darkness might somewhat “hide” the appearance of a curb or other roadway structures. Suppose the roadway was torn up and the tires were unable to have suitable traction. And so on.
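The kind of contextual gating described above can be illustrated with a minimal sketch. This is purely hypothetical code (the categories, allowed values, and function names are my own assumptions, not anything drawn from Apple’s system), but it conveys how an ODD check might compare current conditions against an allowed operating envelope:

```python
from dataclasses import dataclass

@dataclass
class DrivingContext:
    """Snapshot of conditions an AI driving system might check against its ODD."""
    weather: str       # e.g., "clear", "cloudy", "rain"
    lighting: str      # e.g., "daylight", "night"
    road_surface: str  # e.g., "dry", "wet", "gravel"

def within_odd(ctx: DrivingContext) -> bool:
    """Return True only if every contextual factor falls inside the allowed envelope."""
    allowed_weather = {"clear", "cloudy"}
    allowed_lighting = {"daylight"}
    allowed_surface = {"dry"}
    return (ctx.weather in allowed_weather
            and ctx.lighting in allowed_lighting
            and ctx.road_surface in allowed_surface)

# The reported incident conditions: mild cloudiness, daylight, dry roadway.
incident = DrivingContext(weather="cloudy", lighting="daylight", road_surface="dry")
print(within_odd(incident))  # True -- conditions were inside this assumed envelope

# By contrast, a rainy night would fall outside the sketch's envelope.
stormy = DrivingContext(weather="rain", lighting="night", road_surface="wet")
print(within_odd(stormy))    # False
```

The takeaway is that, by this kind of reckoning, the conditions at the time of the curb strike would have been squarely inside a typical testing envelope, which makes the incident all the more curious.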

In this case, it was pretty much a normal day here in late September in the Bay Area, relatively sunny with some cloud cover, predominantly dry roads (we don’t get much rain at this time of the year), and the like.

The Apple self-reported description of the incident is this: “A test vehicle, operating in autonomous mode in Sunnyvale and turning right from Mathilda Avenue onto Del Ray Avenue, made contact with a curb at approximately 13 miles per hour. While there was no tire or wheel damage, the contact resulted in misalignment. No other agents were involved, no injuries were reported, and law enforcement was not called to the scene.”

I decided to drive over to the scene and make that same right turn, going from Mathilda Avenue onto Del Ray Avenue. Admittedly, I did so a few weeks after the incident, and therefore the driving scene could have been quite different than it was on the date of the reported matter.

It would though seem reasonable to suggest that there is nothing extraordinary about the right turn at that corner. All in all, it appears to be an everyday kind of corner.

As you are coming down Mathilda Avenue, there are four lanes of traffic. The rightmost lane can either make the right turn onto Del Ray Avenue or continue ahead on Mathilda Avenue. When nearing the right turn, there is a fire hydrant on the sidewalk, just a bit before the corner. As you come around the corner, there is a lamppost on the sidewalk just as you start to straighten out. Upon straightening out of the turn, there is room for cars to be parked on the right side of the street, though the street is wide enough that there is ample room between those parked cars and navigating further down Del Ray Avenue.

Allow me to explain why I am dragging you through those details.

If the corner was tricky in some fashion, it would lend toward the possibility that any kind of driving by either a human or an AI driving system might be befuddled by the turn. In this case, the corner appears relatively normal. The curb height is normal. The curb is the usual color. It does not seem to be obscured or in any manner undetectable. We also know that the matter occurred in daylight, though I suppose some shadows from nearby trees might have been cast (we’ll come back to that aspect in a moment).

Given the particulars, there is no obvious basis for striking the curb.

I can readily speculate that human drivers though might sometimes strike this curb.

I observed that as you drive down Mathilda Avenue, it is quite possible that other traffic behind you might be trying to push you along. When seeking to make that right turn at Del Ray Avenue, if perchance a hectic driver is behind you, and if the hectic driver is intending to proceed straight ahead, you could feel the pressure from that driver as you make the right turn.

You’ve probably experienced that before.

While making a turn off of an active roadway, a car behind you gets irked that you are slowing down to make the turn. They want to continue forward at the pace of prevailing traffic. You want to slow down to make the turn. They don’t want to slow down. This is a recipe for potential problems or disasters.

Sometimes, the driver from behind will ram into the car making the turn. That’s bad. Sometimes the hectic driver will switch lanes at the last moment and cause traffic in the adjoining lane to have to swerve. That’s bad.

The hectic driver is upset that the other driver is slowing down to make the right turn. This though is a legally allowed aspect, and the hectic driver should be adjusting to allow for the turning car to safely make the turn. That being said, if the turning driver does the turn by completely coming to a stop or doing something oddball-like, this could be said to be inappropriately disturbing the traffic that is on the active roadway.

We are all supposed to dance together collegially and civilly when driving our cars.

There is something about this right turn that does stand out though.

There is a ton of space to make the right turn. You could easily swing wide and have lots of room to do so. That is because there is a bit of a painted roadway diamond that separates off the other side of Del Ray Avenue, where only a right turn is allowed when leaving Del Ray Avenue onto Mathilda Avenue.

Complicating the driving scene is the possibility of cars on Mathilda Avenue being able to make a semi-protected left turn onto Del Ray Avenue. In essence, cars going up Mathilda Avenue could decide to try and make a left onto Del Ray Avenue.

I’d say the odds are you’ve encountered this type of driving setting too. You want to make a right turn. You can do so without coming to a full stop since there isn’t any Stop sign or light controlling the turn. There is though a chance that someone might make a left and come onto the same street or avenue that you are making a right turn into.

You need to eye that left turn since a driver might decide to abruptly cut you off as you make your right turn. Or, you make the right turn, and the left turn driver darts into the same lane area as you are now immersed in. I could definitely see that a war of road warriors could arise at this turning point. The right turn driver going to battle with a left turn driver.

I realize you can use the right-of-way rules to figure out how this traversal should occur. But I dare say that right-of-way and the actual daily driving antics are not necessarily one and the same. A driver making this right turn needs to protect themselves and anticipate that a driver intending to make a left into the same roadway is going to be a threat. Whether the left turn driver is in the right is somewhat immaterial. The point being that the left-turn driver might be careless, might be distracted, might be drunk, or whatever.

Okay, to recap, a human driver could inadvertently strike the curb if they felt pressured by the traffic behind them when making the right turn. They might oversteer in their “panic” of making the right turn, trying to avert getting rammed from behind by some crazed following driver.

Another means of striking the curb would be if you were making the right turn and were somewhat startled to see that a car making a left turn has decided to enter into the roadway at the same point of your turning activity. You might try to avoid the interloper car by oversteering into the curb.

Per Apple’s succinct self-reported description, we do not know whether there was traffic directly behind the self-driving car as it made the right turn. We also don’t know if a car was making a left turn onto the same roadway at the time of the incident.

We do know that the self-driving car was moving at a relatively slow speed, namely at 13 miles per hour.

The speed is slow enough that we might infer that the self-driving car was making the right turn at a reasonable pace. In other words, if the self-driving car had been going say 35 to 40 miles per hour on the active roadway, and then tried to take the turn without reducing speed, we would seemingly question why the AI driving system had not slowed to make the turn.

We do not know if the self-driving car was trying to make any evasive maneuvers. One doubts that it was doing so. This would seem like something that ought to have been mentioned if indeed it was related to the curb strike. We’ll assume that there was no evasive action underway.

My overarching point is that we can come up with lots of potential underlying reasons that a human driver might strike that curb. This would partially arise due to the design of the corner and the roads that intersect there. The mainstay would be that oversteer might occur by a human driver based on the perceived angst due to a driver behind them or a driver trying to make a left turn into the same roadway path.

As far as we know, there were no other drivers nearby that somehow led to or were amidst this incident. No other human-driven cars seemed to factor into the situation. Likewise, no other self-driving cars seemed to factor into the situation (that’s a bit of a new notion, namely that we nowadays have the added possibility of a self-driving car being nearby another self-driving car while both are actively underway on a public roadway).

Shift for a moment into a generalized perspective about self-driving cars.

Keep in mind that some AI driving systems might not especially note that a driver in a human-driven car behind the autonomous vehicle is getting upset and tailgating the self-driving car. Some AI driving systems basically do not take such actions into account. The programming of the AI driving system is focused on making the turn, and there is little or no coding that deals with cars behind the self-driving car during that maneuver.

Humans of course do tend to take that into account (not all, but many do).

Many AI driving systems tend to focus their sensors and multi-sensor data fusion (MSDF) on what is ahead or in front of the self-driving car, rather than giving much credence to what is behind the autonomous vehicle. In that case, there is a chance that a left-turning car that might impede the self-driving car would be detected.

I say that there is a chance because there is also a chance that such a left-turning car might not be detected. It all depends upon how detectable the left-turning car is. If there was a lot of traffic at the time of making the right turn, it is conceivable that detecting a car that is several lanes away and entering the left turn in real-time would be somewhat difficult. Of course, once it proceeded to make the left turn, the odds are that the right-turning self-driving car would get a strong sensory detection of the interloping vehicle.
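One way to picture why a distant, partially occluded left-turner might slip through while a committed one gets strongly detected is a naive fusion of per-sensor confidences. This is a deliberately simplified sketch (the sensor names, numbers, and independence assumption are all mine, not drawn from any actual MSDF implementation), but it shows the basic arithmetic:

```python
def fused_confidence(detections: dict) -> float:
    """
    Naive multi-sensor fusion for one tracked object: treat each sensor's miss
    as independent, so P(detected) = 1 - product of per-sensor miss probabilities.
    """
    miss = 1.0
    for p in detections.values():
        miss *= (1.0 - p)
    return 1.0 - miss

# A left-turning car several lanes away, partially occluded by traffic
# (hypothetical per-sensor confidences):
far_car = {"camera": 0.30, "radar": 0.40, "lidar": 0.25}

# The same car once it commits to the left turn and closes in:
near_car = {"camera": 0.90, "radar": 0.85, "lidar": 0.80}

print(round(fused_confidence(far_car), 3))   # 0.685 -- might sit below a tracking threshold
print(round(fused_confidence(near_car), 3))  # 0.997 -- a strong detection, as noted above
```

The point of the sketch is simply that fused confidence climbs steeply as more sensors register the vehicle, which matches the intuition that the interloper becomes hard to miss once it is actually in the turn.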

This brings up a whole slew of other aspects. For example, if a self-driving car was making a right turn and another car suddenly made a left turn that imperiled the self-driving car, what would the AI driving system do?

It all depends upon the programming.

The AI driving system might try to slam on the brakes and come to a halt. Or, it might try to stay as close to the curb as possible, aiming to stay away from an interloper, and still proceed to make the turn. In theory, the AI driving system could even calculate whether to go up onto the sidewalk, avoiding the interloper as much as possible, if it seemed that staying in the roadway was going to highly likely produce a collision.
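Those three candidate responses can be framed as a cost-minimization over predicted collision risks. The sketch below is entirely hypothetical (the maneuver names, risk numbers, and the penalty for leaving the roadway are assumptions for illustration), but it captures the idea that mounting the sidewalk would only win out when every in-road option is nearly certain to collide:

```python
def choose_maneuver(collision_risk: dict) -> str:
    """
    Pick the candidate maneuver with the lowest predicted collision risk,
    applying a strong penalty against leaving the roadway (mounting the
    sidewalk) so it is chosen only when in-road options are dire.
    """
    SIDEWALK_PENALTY = 0.5  # assumed bias toward staying on the road
    best, best_cost = None, float("inf")
    for maneuver, risk in collision_risk.items():
        cost = risk + (SIDEWALK_PENALTY if maneuver == "mount_sidewalk" else 0.0)
        if cost < best_cost:
            best, best_cost = maneuver, cost
    return best

# An interloping left-turner where braking cleanly avoids the collision:
print(choose_maneuver({"hard_brake": 0.05, "hug_curb": 0.40, "mount_sidewalk": 0.10}))
# -> 'hard_brake'

# Staying in the roadway is highly likely to produce a collision:
print(choose_maneuver({"hard_brake": 0.95, "hug_curb": 0.90, "mount_sidewalk": 0.20}))
# -> 'mount_sidewalk'
```

Note that in the middle case, hugging the curb as tightly as possible could itself produce a curb strike, which is one speculative path to the kind of contact Apple reported.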

None of those considerations appear to be a factor in this instance.

For unexplained reasons (per the Apple self-report), the self-driving car struck the curb. That’s all we know.

Given that we are currently in the month of October and nearing Halloween, you could lightheartedly dismiss the circumstance by saying that a ghost might have caused the curb strike (the spooky apparition having jumped the gun on celebrating Halloween, as it were).

Keeping our heads in the serious game of real-world self-driving cars, consider something not quite so supernatural.

One of the foundational aspects of any AI driving system is the programming required for properly navigating the roadways. This involves detecting the drivable road space. Humans do so with ease, most of the time. You glance up ahead and can readily see that the road goes this way or that way, and you can readily see that the curbs are here or there.

Curbs can be tricky.

Some places have no discernible curb. Some places have a curb that runs along and then no longer continues. There are breaks in curbs. Curbs can be cracked and piecemeal. Some curbs are painted red, meaning you aren’t supposed to stop there. Curbs are a vital aspect of being a driver, though we tend to give little explicit regard to the importance of curbs.
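To make the trickiness of curbs a bit more concrete, consider a toy version of curb detection from a lateral ground-height profile (the sort of thing a LIDAR sweep might yield). Everything here is a made-up simplification, not an actual perception algorithm, but it shows why a crisp curb face is easy to find while a worn, gradual one can be missed:

```python
def find_curb(heights, step_threshold=0.08):
    """
    Scan a lateral profile of ground heights (in meters), left to right,
    and return the index where an abrupt vertical step suggests a curb face.
    Returns None if no single step exceeds the threshold (e.g., a rolled,
    crumbled, or absent curb).
    """
    for i in range(1, len(heights)):
        if heights[i] - heights[i - 1] >= step_threshold:
            return i
    return None

# Flat road, then a crisp ~15 cm step up to the sidewalk:
profile = [0.00, 0.00, 0.01, 0.01, 0.16, 0.16, 0.17]
print(find_curb(profile))  # 4 -- the curb face begins at the fifth sample

# A worn curb with only a gradual rise may go undetected:
worn = [0.00, 0.02, 0.04, 0.06, 0.08, 0.10]
print(find_curb(worn))     # None -- no single step clears the threshold
```

Real systems fuse many such cues (camera segmentation, map priors, and more), precisely because any single heuristic like this one fails on the broken and piecemeal curbs just described.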

For many of the AI driving systems in use today, a herculean effort is made to pre-map where the autonomous vehicle is going to go. Besides purchasing or making digital maps that showcase the roads and their structures, including the curbs, another aspect involves driving around a given area and further enhancing the digitized maps.

Some fervently believe that the future of self-driving cars is tied to having those kinds of strictly detailed pre-mapped and pre-driven datasets. Others decry this as a sad reliance that, notably, humans do not need for driving purposes. When you go to drive in a locale that you’ve never driven before, you don’t need to have detailed maps per se. You merely start driving. You observe the roadway as the roadway arises. This is what a true self-driving car and a true AI driving system ought to be able to do, some argue vociferously.

Returning to the Apple self-driving car matter at hand, we would likely expect that this locale would have been pre-mapped and pre-driven, a logically reasonable assumption given that it is in Sunnyvale and nearby a complex of Apple facilities. It seems entirely unimaginable that this corner was an unknown corner and that the AI driving system would not have already had lots of data about the corner.

Put on your sleuth-like thinking cap.

Somewhat like a Sherlock Holmes detective scenario, we seem to have eliminated the car-behind theory, the left-turning car theory, and the “never seen this” theory about the location of the curb strike.

Just for the sake of discussion, we will also eliminate the “steering problems” theory, namely that perhaps the mechanics of the steering on the self-driving car were already amiss. The basis for asserting that the steering was likely working and not somehow askew is that by and large the self-driving cars in these tryouts are being pampered beyond belief. Each night, the self-driving cars throughout the Silicon Valley area are usually brought back into their respective maintenance and repair facilities and are given a careful inspection. They are then maintained and repaired. Furthermore, when starting on a trip the next morning, usually a complete inspection is done again to make sure that the autonomous vehicle is suitable to go on a driving journey.

We are left with the notion that the AI driving system presumably had sufficient data about where the curb was. The weather and roadway conditions did not seem to be a factor since it was a clear day and during daylight that this occurred. No other cars apparently were involved. The roadway surface was not a seeming factor. The condition of the self-driving car itself was not a seeming factor (we are assuming it was physically operating as normal).

It would appear that the only thing remaining to deduce is that the butler did it.

The AI driving system might have miscalculated the right turn and run into the curb.

Perhaps the right-turn programming had been recently updated or modified, and a bug or error was introduced that led to the turn being taken overly tight.

There is also a chance that the sensors detected something that is not apparent now but was considered a concern at the time by the AI driving system. Some nearby object was misinterpreted as an issue and thus the AI driving system directed the self-driving car to take as tight a turn as feasible, including rubbing against the curb.

Striking a curb is pretty much a simple matter and maybe not worthy of giving any undue attention. On the other hand, navigating a turn and properly avoiding striking a curb is considered the Self-Driving Car 101 kind of stuff. You would be extremely hard-pressed to claim that an everyday curb is an outlier or edge or corner case.

One of the biggest and most common refrains about developing AI driving systems is the qualms about addressing those outliers, edge cases, and corner cases. Some believe that the long-tail aspects of zillions of such unusual cases will demonstrably delay the advent of self-driving cars, and there are even skeptics that say it will forever delay safe self-driving cars, see my discussion at this link here.

Anyway, it is a bit of a head-scratcher that a curb strike of this nature occurred. Maybe some additional AI programming and coding nourishment is needed.

As they say, an apple a day can keep the curb strikes away.

