Navigating Retail & Hospitality Analytics with Confidence
Why Viana's Privacy-First Approach is Not Facial Recognition
The landscape of Australian retail technology has shifted significantly over the last year. While the 2024/2025 Kmart determination first put a spotlight on how customer data is handled in retail spaces, the early 2026 Administrative Review Tribunal (ART) decision regarding Bunnings has further refined the "high bar" required for the use of biometric technology.
These cases have sent a clear message: while technology in retail is evolving, the protection of personal privacy is non-negotiable.
We see this as a timely opportunity to clarify the crucial differences between these technologies.
To be clear from the outset: meldCX’s Viana™ is not Facial Recognition Technology (FRT).
Viana is built from the ground up on a ‘Privacy by Design’ foundation. Unlike FRT, which captures sensitive biometric data to identify who a person is, Viana is designed to understand anonymous behaviour—how many people are in a space and where they go.
We achieve this through immediate, on-device anonymisation, a process that ensures no personal or biometric data is ever captured, stored, or transmitted. This keeps our solution aligned with the Australian Privacy Act, the GDPR, and the upcoming transparency requirements for Automated Decision-Making (ADM).
The rest of this article takes a deep dive into how our technology works, how it aligns with the 2026 regulatory climate, and why our approach offers a powerful, ethical, and future-proof solution for modern retail and hospitality settings.
Demystifying the Technology: What is Facial Recognition (FRT)?
At its core, Facial Recognition is designed to identify or verify an individual. It works by capturing an image of a face, converting that image into a digital "map" or biometric template, and then comparing that template against a database of known individuals.
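To make the template-matching step concrete, here is a deliberately simplified sketch of the comparison logic described above. It is purely illustrative: real FRT systems derive templates from deep-learning embeddings with hundreds of dimensions, and the names, vectors, and threshold below are invented for the example.

```python
import math

# A biometric "template" is, in simplified terms, a numeric vector derived
# from facial features. The 4-dimensional vectors here are illustrative only;
# production systems use far larger learned embeddings.
known_faces = {
    "person_a": [0.12, 0.85, 0.33, 0.67],
    "person_b": [0.91, 0.04, 0.58, 0.22],
}

def euclidean_distance(a, b):
    """Distance between two templates; smaller means more similar."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(template, database, threshold=0.3):
    """Return the closest known identity, or None if nothing is near enough."""
    best_name, best_dist = None, float("inf")
    for name, stored in database.items():
        dist = euclidean_distance(template, stored)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

# A freshly captured template very close to person_a's stored one matches:
print(identify([0.13, 0.84, 0.35, 0.65], known_faces))  # person_a
```

The key privacy point is that this workflow only functions if the stored templates persist, which is exactly the "sensitive information" collection the regulators have scrutinised.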
Under the Australian Privacy Act, this biometric template is classified as "sensitive information." The 2026 Bunnings decision confirmed that even when used for legitimate security purposes, the collection of this data requires a rigorous demonstration of proportionality and clear, prominent notice to consumers. For many retailers, the compliance burden and potential for brand damage make traditional FRT a high-risk path.
The Viana Difference: Understanding Behaviour, Not Identity
Viana takes a fundamentally different path. Our technology does not "recognise" faces; it detects "human forms" and "behavioural patterns."
- On-Device Anonymisation: Viana processes video frames in real-time at the "edge" (on the camera or a local device). Before any data is ever saved or sent to a dashboard, it is converted into mathematical coordinates that represent movement and dwell time. The original visual data is instantly discarded.
- No Biometric Templates: Viana does not create or store a unique "face print." It cannot tell if the person who walked in today is the same person who visited last Tuesday.
- No "Sensitive Information": Because Viana never identifies a specific person, it does not collect "sensitive information" as defined by current Australian law. This means retailers can gain deep insights without triggering the restrictive compliance requirements recently highlighted by the OAIC.
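The edge pipeline described in the points above can be sketched in a few lines. This is a hypothetical illustration based on the behaviour described, not Viana's actual implementation: the detector is a stand-in, and the function and field names are invented for the example.

```python
import random

def detect_people(frame):
    """Stand-in for an on-device person detector that returns bounding
    boxes as (x, y, width, height). Detections are fabricated here
    purely for illustration."""
    random.seed(42)
    return [(random.randint(0, 600), random.randint(0, 400), 40, 90)
            for _ in range(3)]

def anonymise(frame, timestamp):
    """Reduce a video frame to anonymous movement data.

    The output contains no imagery and no per-person identifier, only
    centre-point coordinates suitable for counting and dwell-time
    aggregation."""
    points = [{"t": timestamp, "x": x + w // 2, "y": y + h // 2}
              for (x, y, w, h) in detect_people(frame)]
    del frame  # the visual data is discarded; only coordinates leave the device
    return points

records = anonymise(frame=bytearray(640 * 480), timestamp=1735689600)
print(len(records))  # 3 anonymous centre points, no image data retained
```

Because only timestamped coordinates survive this step, there is nothing downstream that could be linked back to an individual, even in the event of a data breach.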


Future-Proofing for Late 2026 and Beyond
We design our technology not just for today’s regulations, but for the reforms already on the horizon.
Automated Decision-Making (ADM) Readiness
The Australian Privacy Act amendments taking effect in December 2026 will require businesses to be highly transparent about how they use AI to make decisions that affect individuals. Because Viana focuses on aggregated, anonymous analytics rather than individual profiling, our partners are uniquely positioned to meet these new transparency standards without overhauling their existing systems.
The EU AI Act (August 2026 Milestone)
The EU AI Act is the world’s first comprehensive AI law. By 2 August 2026, "high-risk" AI systems—which include most forms of real-time biometric identification in public spaces—will face stringent registration and audit requirements. Viana’s focus on anonymised analytics ensures we operate outside of these high-risk categories, providing our global customers with a powerful, future-proof solution that bypasses these heavy regulatory burdens.
Ethical AI: Synthetic Data Training
Viana’s commitment to privacy extends to how we build our software. Our AI models are not trained using photographs or videos of real people. Instead, we use our proprietary 3D engine to create synthetic data. This allows us to generate millions of diverse scenarios to train our models effectively, removing demographic bias and protecting individual privacy from the very first line of code.
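The idea of bias-controlled synthetic generation can be illustrated with a small sketch. This is not meldCX's actual 3D engine; the scene parameters and sampling scheme below are assumptions chosen to show how parameterised generation yields an evenly balanced training set rather than inheriting the biases of a scraped photo dataset.

```python
import itertools
import random

# Hypothetical scene parameters; a real 3D engine would expose many more.
BODY_TYPES = ["small", "average", "tall"]
LIGHTING = ["dim", "indoor", "daylight"]
CAMERA_ANGLES = [15, 30, 45, 60]  # degrees of downward tilt

def generate_scenes(n_per_combo=2, seed=0):
    """Enumerate every parameter combination the same number of times,
    so no body type, lighting condition, or viewpoint dominates."""
    rng = random.Random(seed)
    scenes = []
    for body, light, angle in itertools.product(BODY_TYPES, LIGHTING, CAMERA_ANGLES):
        for _ in range(n_per_combo):
            scenes.append({
                "body_type": body,
                "lighting": light,
                "camera_angle": angle,
                # per-scene jitter makes each rendered scenario unique
                "walk_speed": round(rng.uniform(0.8, 1.6), 2),
            })
    return scenes

scenes = generate_scenes()
print(len(scenes))  # 3 body types * 3 lighting * 4 angles * 2 = 72 scenes
```

Exhaustive enumeration over the parameter grid is what distinguishes this approach from sampling real-world footage, where the distribution of people and conditions is whatever the cameras happened to capture.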

Conclusion: Gaining Insight Without Compromising Trust
The regulatory landscape in 2026 is sending a clear signal: the use of technologies that capture sensitive biometric information faces intense and justified scrutiny. However, the need for retailers to understand their physical environments to improve service, layout, and efficiency remains as vital as ever.
Viana proves that these two realities are not mutually exclusive. It is entirely possible to gain powerful, data-driven insights without compromising customer trust or breaching privacy regulations. Viana offers a compliant, ethical, and effective alternative that delivers profound business value while fundamentally respecting the individual’s right to remain anonymous in the physical world.
