The point of the test is to demonstrate that vision-only, the approach Tesla has adopted, is inadequate. A car with lidar or radar would have been able to "see" that it was approaching an obstacle without being fooled by the imagery.
So yes, it seems a bit silly, but the underlying point is legitimate. If the software can be fooled by this, can you ever fully trust it? Especially when sensor systems exist that don't have this problem at all. Would you want to be a pedestrian in a crosswalk with this car bearing down on you in FSD?