Autonomous driving is on the horizon, and vehicles with partially automated driving capabilities are already on the market. Before widespread adoption, however, human factors issues in the automated driving context need to be addressed.
A key component of this is how much drivers trust automated driving systems and how they calibrate their trust and reliance based on experience. In this paper, we report the results of a survey conducted with Tesla drivers about their experiences with two advanced driver assistance systems, Autopilot and Summon. We found that drivers have high levels of trust in both Autopilot and Summon.
Trust decreased with age for Autopilot but not for Summon. Drivers who experienced unexpected behaviors from their vehicles reported lower levels of trust in Autopilot. Over time, trust in these systems increased regardless of experience. Additionally, trust was correlated with several attitudinal and behavioral factors such as frequency of use, self-rated knowledge about these systems, and ease of learning. These findings highlight the importance of trust in the real-world use of autonomous cars and suggest that previous laboratory findings on trust apply to real-world cases as well.
Vehicles with partially autonomous driving capabilities have recently become available, and dozens of companies are now building advanced driving automation systems and self-driving cars. Before these technologies become widely available, there is a need to understand how drivers will behave in partially automated vehicles, i.e., vehicles that require constant monitoring by the driver (Level 2 in the SAE classification). A key element in understanding the driver-vehicle relationship is identifying the psychological characteristics of drivers and the cognitive processes that influence how drivers use these technologies. One important component of these processes is trust in automation. This study aims to understand the role of trust in driver-vehicle interaction in the context of autonomous vehicles.
Tesla drivers have had access to advanced driver assistance systems for some time. In 2015, Tesla introduced two such systems, Autopilot and Summon. Autopilot combines lane steering assistance with adaptive cruise control, allowing hands-free driving in a limited context.
Summon is an automated parking system that allows the vehicle to maneuver into and out of a garage using a smartphone app. We previously reported drivers’ general attitudes towards these features and how they use these systems.
The current study builds on that work by investigating Tesla drivers’ trust in Autopilot and Summon. Tesla’s Autopilot, like other advanced driver assistance systems (ADAS), is far from perfect, and failures are common. Given this imperfection, a critical issue is the degree of reliance on these automated systems. If drivers rely completely on the capabilities of these vehicles, negative consequences during automation failures will be inevitable; a recent fatal Tesla crash is one example. On the other hand, if drivers do not rely on these systems at all, the opportunity to save more lives through automation, which is superior under certain circumstances, will be missed. The concept of trust can help us understand how appropriate reliance develops. In this work, we present findings of an online survey on Tesla drivers’ trust and confidence in Autopilot and Summon.
Trust has been a fundamental concept in human-automation interaction. Inappropriate calibration of trust in an automated system can lead to misuse (overreliance) and disuse (underreliance) of automation and result in decreased performance and lower adoption. There has been considerable research on trust in automation, including reviews and meta-analyses of the factors influencing trust. Lee and See identified three factors that are critical in trusting an automated agent: performance, process, and purpose. Performance refers to the operator’s observation of results, process refers to the operator’s assessment of how the system works, and purpose refers to the intention of the system. These dimensions should match each other in the operator’s mind to establish appropriate levels of trust. For example, if observed performance matches the operator’s understanding of the system (process), then appropriate levels of trust can develop.
Trust in and reliance on automation increase as the perceived reliability of the automation increases. Trust seems to act as a precursor to reliance and to mediate the relationship between beliefs and reliance. Trust decreases with automation errors, but providing explanations of why an error occurred (observing the process) can increase trust and reliance despite the errors. Trust is also more resilient to automation errors if the operator has the ability to control and compensate for them. In addition, the type of automation error influences trust and reliance differently: for example, increased false alarm rates result in less reliance on automation, while alarms that are accurate but not needed by drivers increase trust. Trust in automation increases over time, especially if there are no major failures, and regardless of prior exposure to automation errors; it can even increase over time without constant exposure to the automated system. Age can also affect trust in automation: older people tend to have higher levels of trust in automation, but findings regarding how older people calibrate their trust and reliance are mixed. While some studies showed that they may use different trust calibration strategies, others did not.
Taken together, these findings show how important trust is for reliance on automated systems. However, most of the research discussed consists of laboratory studies, and there is a lack of research on the real-world use of autonomous cars. To fill this gap, we present the results of a survey we conducted with Tesla drivers, asking about their attitudes towards Autopilot and Summon along with their trust in these systems. Based on previous work, we expected trust to be related to frequency of use, to increase over time, to be negatively affected by experiencing an incident, and to increase with age.
In the following analysis, we used data from Autopilot users (N = 109) for trust in Autopilot and data from Summon users (N = 99) for trust in Summon. We compared initial and current trust for Autopilot and Summon. We also examined the relationship between trust and other measures, as well as how experiencing an automation failure influences trust in Autopilot and Summon.
Overall, participants reported high levels of trust in Autopilot (M = 4.02, SD = .65) and moderate levels of initial trust (M = 2.83, SD = .82). As shown in Table 1, trust in Autopilot was positively correlated with frequency of Autopilot use, self-rated knowledge about Autopilot, ease of learning, and usefulness of the Autopilot display. Surprisingly, for those who experienced an Autopilot incident (N = 68), trust was not correlated with how risky they perceived the situation. However, perceived risk was negatively correlated with frequency of use.
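For readers who wish to reproduce this kind of analysis, the bivariate correlations reported above are standard Pearson product-moment coefficients. The sketch below uses purely hypothetical 1-5 Likert ratings; the `trust` and `freq` lists are illustrative placeholders, not our survey data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Sum of cross-products of deviations from the means
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical 1-5 Likert responses: trust vs. frequency of use
trust = [4, 5, 3, 4, 5, 2, 4, 3, 5, 4]
freq  = [3, 5, 2, 4, 4, 1, 3, 3, 5, 4]
r = pearson_r(trust, freq)
```

In practice a library routine such as `scipy.stats.pearsonr` would be used instead, since it also returns the significance level needed to report correlations as in Table 1.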
A one-way ANOVA showed a significant effect of age on trust, F(6, 102) = 2.63, p = .02, partial η² = .13. There was also a significant linear trend, F(1, 102) = 7.80, p = .006. As shown in Fig. 1, trust in Autopilot slightly but significantly decreased with age.
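For readers unfamiliar with the procedure, a one-way ANOVA partitions the total variance into between-group and within-group components and compares their mean squares. A minimal sketch follows, using hypothetical trust ratings for three age groups (the `groups` data are illustrative, not the study's):

```python
def one_way_anova_F(groups):
    """F statistic for a one-way ANOVA over a list of sample groups."""
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    # Between-group sum of squares: weighted squared deviations of group means
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: squared deviations from each group's mean
    ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    df_b = len(groups) - 1
    df_w = len(all_vals) - len(groups)
    return (ss_between / df_b) / (ss_within / df_w)

# Hypothetical 1-5 trust ratings for three age groups (younger to older)
groups = [[4.5, 4.2, 4.8, 4.4], [4.0, 3.9, 4.3, 4.1], [3.6, 3.8, 3.5, 3.9]]
F = one_way_anova_F(groups)
```

A statistics package (e.g., `scipy.stats.f_oneway`) would additionally report the p-value of the F statistic against the F(df_b, df_w) distribution.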
Next, we compared Tesla drivers’ initial and current trust in Autopilot and examined how experiencing an incident (Incident group) or not (No Incident group) affects trust. A 2×2 mixed ANOVA with trust in Autopilot as a within-subjects factor (Initial Trust, Current Trust) and Autopilot incident as a between-subjects factor (Incident, No Incident) showed a main effect of trust, F(1, 107) = 221.05, p < .001, partial η² = .67, and a main effect of incident, F(1, 107) = 9.59, p = .002, partial η² = .08. The interaction effect was not significant, p = .086. As shown in Fig. 2, current trust in Autopilot was higher than initial trust, and those who experienced an Autopilot incident reported lower levels of trust. Surprisingly, they also reported lower levels of initial trust in Autopilot.
Participants (N = 99) reported high levels of trust in Summon (M = 3.80, SD = .93) and moderate levels of initial trust (M = 3.11, SD = 1.01), similar to Autopilot. As shown in Table 2, trust in Summon was positively correlated with self-rated knowledge about Summon and ease of learning. Current trust was positively correlated with frequency of use, and initial trust was positively correlated with computer expertise and negatively correlated with perceived risk. For those who reported a Summon incident (N = 21), initial trust, but not current trust, was negatively associated with the perceived risk of the situation. A one-way ANOVA showed no effect of age on current trust in Summon, F(6, 92) = 1.78, p = .108; trust in Summon did not differ across age groups.
A 2×2 mixed ANOVA with trust in Summon as a within-subjects factor (Initial Trust, Current Trust) and Summon incident as a between-subjects factor (Incident, No Incident) showed a main effect of trust, F(1, 97) = 23.52, p < .001, partial η² = .20. Current trust in Summon was higher than initial trust (Fig. 2). The main effect of incident was not significant, F(1, 97) = 1.05, p = .309, nor was the interaction, F(1, 97) = 2.74, p = .101.
In this work, our goal was to identify how Tesla drivers’ trust in Autopilot and Summon relates to their attitudes towards these systems, and how experience shapes their trust. Overall, we observed high levels of trust and moderate levels of initial trust. Trust increased over time regardless of whether participants experienced an incident. Trust in Autopilot, but not in Summon, decreased with age.
The high levels of trust reported for both Autopilot and Summon indicate that drivers are confident in these systems, which is in line with previous findings. The analysis of correlations revealed interesting patterns. Frequency of Autopilot use was associated with trust: as expected, those who have higher levels of trust tend to use the system more often. However, the reverse is also true: the more drivers experience Autopilot and Summon under different circumstances, the more their trust increases, which supports previous findings on the relationship between trust and experience. Ease of learning was also positively correlated with trust for both Autopilot and Summon. Design features of automation such as usability influence trust by altering users’ perceptions. Likewise, the easy-to-learn characteristics of Autopilot and Summon may have created perceptions of trustworthiness by making the adaptation process easier.
This work had several limitations. Unlike laboratory experiments, trust was not assessed immediately after the incidents, and the time interval between the last incident a driver experienced and the survey varied from person to person. A longitudinal study of how trust develops over time with autonomous vehicles would identify both fluctuations in trust and how drivers psychologically deal with automation failures. Also, while we observed that trust was associated with multiple factors, identifying the exact mechanisms, such as how age, knowledge, and mental models influence trust, requires further research. Trust evolves over time, and while trust influences reliance on automation, it is not the only factor. Future research should also examine the affective component of trust in autonomous cars. Our observation throughout this work has been that there is more than meets the eye when it comes to developing a trust relationship with one’s own car, where factors such as attitudes towards the designer (i.e., the brand or company producing the vehicle), public opinion, and social influence might play an important role. Therefore, it is critical to develop an understanding of trust in personal automation such as personal cars and home automation.
We examined trust in automation in the context of Autopilot and Summon. Overall, Tesla drivers reported high levels of trust in these technologies. Trust was related to several attitudinal and behavioral factors, and experience shaped the level of trust in these technologies. While this work was an initial step towards understanding how trust plays a role in the real-world use of autonomous vehicles, it showed that laboratory findings and concepts developed in the research community apply to real-world cases as well. We hope these findings will help researchers and practitioners understand drivers’ trust in autonomous vehicles, as trust will be a fundamental concept in an automated world.