The first scenario asks whether a self-driving car should prioritize hitting larger vehicles, such as SUVs, or smaller vehicles, such as Mini Coopers. I don't think the car should be programmed to hit any specific type of car. If the self-driving car is programmed to target larger vehicles, then large families are put in danger. If it is programmed to target smaller vehicles, the occupants of those cars will suffer more damage. I think the decision of where the car crashes should be random, as it effectively is today. If the choice is random, then everybody in every car has an equal chance of being hurt in an accident, and an equal chance of coming out safe.
The second scenario asks whether, in an unavoidable accident, the car should prioritize hitting a motorcyclist with a helmet or a motorcyclist without one. The motorcyclist without a helmet should be the one prioritized. Prioritizing the motorcyclist without a helmet will encourage helmet use. Although more harm will come to the unhelmeted rider, the increase in helmet wearing in the future will outweigh that consequence. If you break the law by neglecting to wear a helmet, you bear some responsibility for being the one targeted.
I think it's a good idea for the car to make random decisions using a random number generator. Today, without self-driving cars, all accidents are random, and there is no extra computer or mind placing the burden of a choice on anyone in the accident. If the car were to make random decisions, then the chance of being hurt and the chance of being safe would be equal among all the cars involved.
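To make the idea concrete, here is a minimal sketch of what "random, equal-chance" target selection could look like. The function name and vehicle list are hypothetical illustrations, not a real autonomous-vehicle API:

```python
import random

def choose_impact_target(vehicles):
    """Hypothetical sketch: when a collision is unavoidable, pick among
    the involved vehicles uniformly at random, so every vehicle has the
    same 1/len(vehicles) chance of being the one hit."""
    return random.choice(vehicles)

# Illustrative use: no vehicle type is favored over any other.
vehicles = ["SUV", "Mini Cooper", "sedan"]
target = choose_impact_target(vehicles)
```

Because `random.choice` draws uniformly, no programmer-imposed preference for vehicle size, occupants, or anything else enters the decision, which is the whole point of the proposal.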
If the driver is not making control decisions, the driver should not be responsible for the outcomes. If you are not even driving the car when you get into an accident, then you did nothing to cause it and should not be held responsible. If you were driving during the accident, then yes, you should be held responsible. But if the accident was caused by a machine's error, then you had nothing to do with it.