
Aeyesky
Enabling Smart Observing Services for Casinos
Year
2025
Project Length
5 Months
Platform
Desktop Application
As the product designer of Aeyesky, I redesigned the workflow to balance AI detection with human surveillance, prioritizing accessible user experiences.
My Role
Product Designer
Timeline
Oct 2025 - Now
Team Member
1 Product Manager
1 Product Designer (Myself)
1 Algorithm Expert
2 Developers
Tools
Figma, Teams

Problem
A Reliable Smart Surveillance System with AI Data
While transitioning from manual video review to AI-driven surveillance promises operational efficiency, it introduces a critical trust gap for casino staff. Although the system automates data collection and analysis for table games, AI produces inevitable inaccuracies that traditional teams find hard to manage.
Black Box Uncertainty
AI detection often lacks transparency, making it difficult for users to understand why a specific fault was flagged and how it could be resolved. Without clear communication from the AI, inaccuracies can compromise the system's perceived reliability and analytical credibility.
The Cognitive Barrier
Casino officers typically lack prior experience in AI workflows or dataset cleanup. Moving from intuitive manual observation to a data-driven interface creates a steep learning curve, requiring a design that empowers non-technical users to verify and correct AI outputs seamlessly.
Research & Insight
Understanding Our Users
Aeyesky's SaaS currently targets three types of users.

01
Surveillance Officer
Monitor gaming, review footage and ensure regulatory compliance.

02
Game Protection Analyst
Protect games and identify procedural errors through data and visual analysis.

03
Manager
Oversee the entire security operation.
Misunderstanding of Raw Video
01.
While users rely on real-time previews to monitor table status, video thumbnails alone prove insufficient for rapid decision-making, including identifying dealer faults or performance trends.
The Calibration Training Barrier
01.
Calibration requires officers to label table areas on the canvas, yet most have no prior training in this workflow; without guided onboarding or contextual cues, they struggle to complete or verify a calibration correctly.
The Fragmented Work Loop
02. 03.
The current communication chain between surveillance officers, managers, and analysts creates an operational disconnect: analysts are often brought in late to categorize cases that have already been flagged, leaving them without real-time table awareness.
The Reporting Disconnect
01. 02. 03.
While analysts and surveillance officers are tasked with reporting significant faults and performance trends to management, a gap exists in the documentation workflow. The current process makes it difficult to isolate specific time periods or translate complex AI data into accessible, high-level summaries for stakeholders.
Solution
Transforming a complex casino surveillance system into an automated process.
User Goal
Bridging the trust gap through verification
By transforming complex behavioral data into transparent, actionable alerts, the system reduces users' cognitive load and allows them to verify or correct AI-detected anomalies.
Business Goal
Maximizing Revenue Without Additional Labor
The primary business goal of Aeyesky is to safeguard casino revenue by automating the surveillance process, eliminating the need for costly, labor-intensive expansions of the monitoring team.
TASK 1 - Listen to Users' Feedback
Streamline the Auto-detection for Incidents
01
Pain Point
Design Strategy
01
Cognitive Overload: Forcing the officer to manually categorize results creates a bottleneck
Break down complex categorization into simple, sequential Yes/No logic gates
Separate Operational Action (Call Pit Manager) from Strategic Analysis (Data Analytics)
02
Distrust in AI
Label AI-detection for reference
Provide an entry for reporting errors
When a human reports a "No" at the authenticity gate, that data is routed back to retrain the model.
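The sequential Yes/No gates described above can be sketched as a small triage function — a hypothetical TypeScript illustration, with gate and outcome names assumed rather than taken from the product:

```typescript
// Hypothetical triage flow: each AI-flagged incident passes through
// sequential Yes/No gates instead of a free-form categorization form.
interface Incident {
  id: string;
  aiLabel: string; // what the model thinks it saw
}

type Outcome = "retrain" | "call_pit_manager" | "data_analytics";

function triage(
  incident: Incident,
  isAuthentic: boolean, // gate 1: did the flagged fault really happen?
  needsImmediateAction: boolean, // gate 2: operational vs. strategic
): Outcome {
  // A "No" at the authenticity gate routes the case back to retrain the model.
  if (!isAuthentic) return "retrain";
  // Operational action (call the pit manager) vs. strategic analysis.
  return needsImmediateAction ? "call_pit_manager" : "data_analytics";
}
```

Each gate is a single binary decision, so the officer never faces the full category taxonomy at once.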
Initial Design

Redesign the User Workflow

Final Design

TASK 2 - Trade off with Dev Team
Review Dealer Fault Incidents
01
Pain Point
Design Strategy
01
Onboarding is hard.
Show suggested steps when the user opens the calibration editing page
02
Officers are unable to recognize errors without prior context.
Add alert tooltips when an area has not been labeled
Provide a toolbar that identifies all possible actions with annotations.
Show tooltips prompting the possible actions within the user flow.
Dev Feedback
State and Risk Management
During the hand-off, developer feedback revealed a need for clearer canvas interaction states (Idle, Active, and Dirty) to define how the UI responds during edits. To mitigate the risk of data loss during this time-intensive process, I implemented auto-save triggers and progress recovery flows, ensuring users can resume from session interruptions.
Responsive Design
The engineering team also raised a concern regarding spatial data integrity across different screen resolutions. Currently, the canvas relies on absolute pixel coordinates. If an officer calibrates a table on a 14" laptop and another reviews it on a 27" 4K monitor, the "labeled areas" will drift or misalign because the pixel-to-video ratio is not constant.
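A common remedy is storing labeled areas in frame-relative coordinates (normalized to 0–1) instead of absolute pixels, then projecting them onto whatever canvas size the viewer has. A minimal TypeScript sketch, with all function names assumed:

```typescript
interface Point {
  x: number;
  y: number;
}

// Convert an absolute canvas point to frame-relative [0, 1] coordinates.
function toNormalized(p: Point, frameW: number, frameH: number): Point {
  return { x: p.x / frameW, y: p.y / frameH };
}

// Project a normalized point back onto a canvas of any resolution.
function toCanvas(p: Point, canvasW: number, canvasH: number): Point {
  return { x: p.x * canvasW, y: p.y * canvasH };
}

// A label calibrated at 1280x720 lands in the same relative spot at 4K.
const calibrated = toNormalized({ x: 640, y: 360 }, 1280, 720); // { x: 0.5, y: 0.5 }
const on4k = toCanvas(calibrated, 3840, 2160); // { x: 1920, y: 1080 }
```

Because the stored value is a ratio rather than a pixel count, the 14" laptop and the 27" 4K monitor render the same labeled area without drift.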
Final Design

Reflection
Design with AI Automation
To support officers without AI or data-cleaning experience, the system provides interpretable reasoning rather than raw model output. Each AI-annotated result is accompanied by visual cues and explanations that indicate what was detected and why it was flagged.
Meanwhile, recognizing that human judgment remains essential, the system provides simple correction and confirmation actions.
On-boarding is Important
In the traditional casino environment, detection happens when groups of people manually check cameras for possible dealer faults. Most of them have no prior experience in dataset cleanup or AI use, so on-screen instructions should guide users the first time they open an interface.

