Testing vehicle software is often complex and time-consuming, as it must account for numerous vehicle variants, driving modes, and regional requirements. The process also requires extensive collaboration and review cycles across multiple teams. To improve both efficiency and accuracy, the Arene Testing Platform provides end-to-end support for test engineers across test creation, execution, and result evaluation, and lays the groundwork for intelligent, assistant-driven testing over time.
Solo designer, UX researcher, product owner, engineering teams, client team
End-to-end design & research through the first launch and concept design
2023 Dec - 2024 Jun
Testing vehicle software must account for numerous vehicle variants, driving modes, and regional requirements, and it depends on extensive collaboration and review cycles across multiple teams. As software-driven development began to reshape testing practices, traditional waterfall workflows evolved into automated, pipeline-based processes.
One of the most significant shifts we identified was that, as development processes became more software-focused, existing roles, responsibilities, and team structures no longer aligned with the new workflows. While studying current archetypes, we also identified gaps and overlaps that indicated the need for change.
As a result, we not only documented existing archetypes but also proposed a new set of archetypes better suited to a software-driven development model.
The testing process consists of three primary stages: test creation, execution, and evaluation.
Across all three stages, user pain points followed a consistent pattern: a strong desire for intelligent assistance to reduce errors and minimize repetitive, manual work as testing scales up.
Smart CI/CD testing pipeline and front-end-based assistance for test creation and execution.
Creation module UI & syntax support for test case writing

Provide system context and an auto-navigation feature when debugging test cases

Building the POC front end gave users an experimental playground where they could explore concepts, try workflows, and provide early feedback. We continuously conducted interviews and collected user feedback, using these insights to iterate on the design.
The next step was to more tightly integrate the testing tool with the rest of the platform and the backend pipeline, enabling a more seamless end-to-end testing experience.