Automated vehicle testing & evaluation process

The objective of the Automated Vehicle Testing and Evaluation Process (ATEP) is to develop and validate a safety case process that enables the safe deployment of automated driving system (ADS)-equipped vehicles (AVs) within Arizona and across the nation, creating a transparent and consistent methodology for quantifying safety. The deliverable is a validated safety case-based process, focused primarily on scenario-based testing but including other safety case components, that can be used to establish the operational safety of an AV to a demonstrated level.

This work will leverage the progress made over the three phases of the IAM Metrics Project; the ATEP Mission and the Metrics Project will proceed in parallel in a complementary manner. The deliverable for each ATEP Mission task is described below.

  1. OSA metrics definition: This is a joint IAM/SFAz task. The set of Operational Safety Assessment (OSA) metrics needs to be finalized and defined. This work will be done in coordination and collaboration with the Verification and Validation (V&V) Task Force under the SAE On-Road Automated Driving (ORAD) Committee, where a standards document (J3237) is being developed. The deliverable for both the Metrics Project, Phase 4 and the ATEP Mission will be the J3237 document. (Note that the SAE J3237 effort is a volunteer effort on a separate track from the ATEP Mission; this task will therefore run in parallel with the other tasks to avoid prolonging the mission.)
  2. CARLA-based scenarios and metrics calculations: The (Python-based) simulation tool CARLA will be used to develop simulation-based scenarios for which all of the OSA metrics can be calculated. The deliverable will be CARLA implementations, with OSA metrics measurements, of the 37 pre-crash scenarios that NHTSA has identified as responsible for some 97% of light-duty vehicle crashes.
  3. Scenario database: In order to conduct scenario-based testing, a database of scenarios from which to choose must be developed or sourced (e.g., leveraging ScenarioPool). This database will eventually cover multiple operational design domains (ODDs), but the deliverable will be a database consisting of the scenarios applicable to a single ODD (that of the ASU test AV). The database will include entries from real-world scenarios (Metrics Project) and simulation scenarios (CARLA-based scenarios and metrics calculations task).
  4. Fidelity, relevancy, and complexity of a scenario: The fidelity of closed course testing and simulation testing must be established (public road testing has a fidelity of 1). The relevancy and complexity of each scenario in the database from task 3 (Scenario database) must also be established. The deliverables will be (1) fidelity databases for closed course testing (covering various equipment and facilities) and for simulation testing (covering various simulation tools); and (2) complexity and relevancy values associated with each scenario in the database.
  5. Scenario-based testing methodology: The scenario-based testing methodology will include a method for defining the ODD for the test AV as well as a set of behavioral competencies in that ODD. Based on the OSA Methodology from the Metrics Project as well as the fidelity, relevancy, and complexity determinations (from task 4 above), a method for determining the OSA Methodology “score” for a given scenario navigation must be developed. This will require varying methods depending on the metric and on the test method (simulation, closed course, or public road). The deliverable will be the testing methodology.
  6. Validation of Scenario-based testing methodology: This is a joint IAM/SFAz task. Using the ASU test AV, a validation of the ATEP will take place. This will require modeling of the AV in simulation, testing in simulation, and then validating the simulation results in closed course and public road testing. The deliverable will be evidence that the ATEP can be used to provide an assessment of the operational safety of an AV from scenario-based testing.
  7. Safety case framework: Scenario-based testing is one element of an overall safety case. Other safety case elements include adherence to design standards and best practices, architecture standards and best practices, and safety management system standards and best practices. The deliverable will combine these elements with the scenario-based testing element into an overall safety case framework, structured as a process that an AV developer (or a third party such as an Infrastructure Owner/Operator (IOO)) can use to demonstrate the overall safety of an AV.
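
As a concrete illustration of the metrics calculations in task 2, one commonly used operational safety metric, minimum time-to-collision (TTC), can be computed from the per-timestep gap and closing speed that a CARLA scenario run could log. The data layout and function names below are illustrative assumptions, not the project's actual implementation or the finalized J3237 metric definitions.

```python
# Illustrative sketch: computing a minimum time-to-collision (TTC) metric
# from per-timestep scenario data such as a CARLA run might log.
# The (gap, closing speed) trace format is a hypothetical example.

def ttc(gap_m: float, closing_speed_mps: float) -> float:
    """TTC for one timestep; infinite if the gap is constant or opening."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return gap_m / closing_speed_mps

def min_ttc(trace: list[tuple[float, float]]) -> float:
    """Minimum TTC over a scenario trace of (gap_m, closing_speed_mps)."""
    return min(ttc(g, v) for g, v in trace)

# Example trace: a following vehicle closes a 30 m gap at up to 6 m/s,
# then begins to fall back (negative closing speed).
trace = [(30.0, 2.0), (24.0, 4.0), (18.0, 6.0), (20.0, -1.0)]
print(min_ttc(trace))  # 18.0 / 6.0 = 3.0 s
```

A per-scenario summary statistic like this is the kind of OSA metric measurement that would accompany each of the 37 pre-crash scenarios.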
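
The scenario database in task 3 mixes real-world and simulation entries and is initially restricted to a single ODD. A minimal sketch of such a database is below; the field names and ODD labels are hypothetical, not the project's actual schema.

```python
# Hypothetical sketch of a scenario database entry and an ODD filter;
# field names and labels are illustrative, not the project's schema.
from dataclasses import dataclass

@dataclass(frozen=True)
class Scenario:
    scenario_id: str
    source: str        # "real-world" (Metrics Project) or "simulation" (CARLA)
    odd: str           # operational design domain label, e.g. "ASU campus"
    description: str

def scenarios_for_odd(db: list[Scenario], odd: str) -> list[Scenario]:
    """Return the subset of the database applicable to a single ODD."""
    return [s for s in db if s.odd == odd]

db = [
    Scenario("S001", "real-world", "ASU campus", "unprotected left turn"),
    Scenario("S002", "simulation", "ASU campus", "pedestrian crossing"),
    Scenario("S003", "simulation", "highway", "cut-in at speed"),
]
print(len(scenarios_for_odd(db, "ASU campus")))  # 2
```

Tagging each entry with its ODD is what allows the deliverable database, scoped to the ASU test AV's ODD, to later grow to cover multiple ODDs without restructuring.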
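
Tasks 4 and 5 together imply combining per-scenario metric results with the fidelity of the test method used. One plausible way to do this is a fidelity-weighted average: public road fidelity of 1 comes from the task description, but the other weights, the score scale, and the averaging scheme below are assumptions for illustration, not the project's actual scoring method.

```python
# Illustrative sketch: fidelity-weighted aggregation of per-scenario
# scores. Public road fidelity = 1.0 per the task description; the
# closed course and simulation weights are ASSUMED placeholder values,
# as is the weighted-average scheme itself.

FIDELITY = {"public road": 1.0, "closed course": 0.9, "simulation": 0.7}

def weighted_score(results: list[tuple[float, str]]) -> float:
    """Fidelity-weighted average of (score in [0, 1], test method) pairs."""
    num = sum(score * FIDELITY[method] for score, method in results)
    den = sum(FIDELITY[method] for _, method in results)
    return num / den

results = [(0.95, "simulation"), (0.90, "closed course"), (1.00, "public road")]
print(round(weighted_score(results), 3))
```

Whatever the final scheme, down-weighting lower-fidelity evidence in this way captures the intent that simulation and closed course results count toward, but do not fully substitute for, public road performance.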