Dashboard Overview
The tapioka.ai dashboard is your mission control for AI-driven quality assurance. It provides a comprehensive view of your projects, testing infrastructure, and execution metrics.
Project Management
Upon logging in, you are presented with the Project List. Each project is represented as a card showing:
- Platform & Name: The supported platform chosen at creation and the project's name. The platform cannot be changed later.
- Key Metrics: Number of Test Suites and active Test Cases within that project.
- Quick Links: Fast access to the project's test library or recent runs.
Test Structure
- The Project is the highest level of the hierarchy. It encompasses the entire codebase, configuration, and environment settings required to run your tests.
- A Test Suite is a logical collection of test cases grouped together. Suites are usually organized by feature, module, or priority (e.g., "Smoke Tests" or "Billing Module").
- The Test Case is the fundamental unit. It validates a specific behavior or a single "path" through the application, and is composed of three parts (see the sketch after this list):
  - Pre-conditions: The specific state needed before the test runs (e.g., being logged in).
  - The Scenario: The actual interaction, such as clicking a button or navigating to a given screen.
  - The Expected Result: The "moment of truth", where the expected outcome is compared against the actual result.
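To make the hierarchy concrete, here is a minimal TypeScript sketch of how a Project, its Suites, and their Test Cases nest together. The interface and field names are illustrative assumptions, not tapioka.ai's actual schema or API.

```typescript
// Illustrative only: names and fields are assumptions, not tapioka.ai's data model.
type Platform = "Android" | "iOS" | "Web" | "Desktop";

interface TestCase {
  name: string;
  preConditions: string[]; // e.g., ["user is logged in"]
  scenario: string[];      // ordered interactions, e.g., ["tap the 'Checkout' button"]
  expectedResult: string;  // compared against the actual outcome at run time
}

interface TestSuite {
  name: string;            // e.g., "Smoke Tests" or "Billing Module"
  cases: TestCase[];
}

interface Project {
  name: string;
  platform: Platform;      // fixed at creation; cannot be changed later
  suites: TestSuite[];
}
```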
Usage & Resource Monitoring
The dashboard includes real-time widgets to help you manage your SaaS resources (a short sketch of how these figures roll up follows the list):
- Current Month Usage: Tracks total execution "minutes used" across all workers.
- Storage Consumption: Measured in GB (e.g., 24.81 GB), covering test artifacts, videos, and visual evidence.
- Billing & Plans: Direct access to manage your current tier (e.g., "Premium - Pay as you go") and view invoices.
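The snippet below is a rough illustration of the arithmetic behind the two usage counters: per-run execution minutes and artifact sizes aggregated into month-to-date totals. The record shape and function name are hypothetical and not part of any tapioka.ai API.

```typescript
// Hypothetical usage aggregation; field names are illustrative only.
interface RunRecord {
  executionMinutes: number; // minutes consumed by one run across all workers
  artifactBytes: number;    // videos, screenshots, and other visual evidence
}

function summarizeUsage(runs: RunRecord[]) {
  const minutesUsed = runs.reduce((sum, r) => sum + r.executionMinutes, 0);
  const storageGb = runs.reduce((sum, r) => sum + r.artifactBytes, 0) / 1e9;
  return { minutesUsed, storageGb: Number(storageGb.toFixed(2)) }; // e.g., 24.81 GB
}
```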
Core Navigation
The sidebar provides deep-link access to the platform's four pillars:
- Tests: The repository of your test suites and individual cases.
- Runs: The execution engine where you trigger batch runs and monitor history.
- Analyse: Advanced reporting and AI-aided results analysis.
- Devices: Management of your virtual and physical device fleet (Android, iOS, Web, Desktop).
Test Library Interface
Inside a project, tests are organized into Suites (e.g., "Smoke Tests", "Regression Pack"). The interface provides clear status indicators, modeled in a short sketch after this list:
- Learning Status: Shows the percentage of the test scenario that the AI has successfully "learned".
- Execution States: Visual labels for Passed, Failed, Active, or Scheduled runs.
- Device Assignments: Displays which hardware an execution is targeting (e.g., Samsung S23).
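For readers who script against run results, the indicators above can be pictured as a small status model like the one below. The type and field names are assumptions made for illustration, not the platform's actual API.

```typescript
// Illustrative status model; names are assumptions, not a tapioka.ai API.
type ExecutionState = "Passed" | "Failed" | "Active" | "Scheduled";

interface TestCardStatus {
  learningPercent: number; // how much of the scenario the AI has "learned" (0-100)
  state: ExecutionState;   // latest execution label shown on the card
  targetDevice?: string;   // e.g., "Samsung S23"
}
```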