meldCX & Signagelive: Measure the Impact of Your Digital Signage

Published on March 25, 2022
Raffi Vartian, Vice President, Business Development and Strategic Partnerships

A conversation between meldCX & Signagelive, a global leader in the digital signage industry with over 20 years of experience, powering screens deployed across 46 countries with a cloud-based platform.

Digital signage has become an essential communication tool. Dynamic, engaging, and memorable, digital signage technologies have proven to be a strong influence on the buyer's journey.

Consider this — 80% of shoppers claim to have entered a store because they got lured in by digital signage. 

Yet, for a long time, the creation of signage content has been a one-sided process. Once the content is displayed, there is little to no way for organizations to truly understand the full depth of its impact on the business; no attribution model.

Enter data-driven digital signage

Imagine if you could anonymously track the glances and view-throughs of your content, or which demographic groups viewed it the most — just as you would look at YouTube analytics.

Think it isn’t possible? Think again. 

Join Raffi Vartian (meldCX USA VP of Business Development & Strategic Partnerships), Jason Cremins (Founder & CEO of Signagelive) and Tim Baker (Product Marketing Manager of Signagelive), as they explore advancements in digital signage technologies, anonymous audience measurement with AI, and how important content effectiveness tracking is to your overall content strategy. 


<iframe width="100%" height="450" src="https://www.youtube.com/embed/7ARm7YgU-Oo" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>

Featuring: Raffi Vartian (meldCX), Jason Cremins (Signagelive) and Tim Baker (Signagelive)

Transcript:

[00:00:00] Tim Baker: Hello. Today we're going to be talking about meldCX and Viana. We've got Jason Cremins from Signagelive and Raffi Vartian from meldCX here with us. Welcome, both.

[00:00:09] Raffi Vartian: Hello! 

[00:00:11] Tim Baker: Excellent. Well, thank you very much for joining us today. We're going to be going through some of the benefits of the solution that's been put together with Viana and Signagelive. Raffi, did you kind of want to start off with a quick introduction of yourself and meldCX in general?

[00:00:25] Raffi Vartian: Absolutely. Thank you very much, Tim. And thank you, Jason, for having me on. My name is, again, Raffi Vartian and I'm based out of Chicago. I am the Vice President for Business Development and Strategic Partnerships for meldCX. We have two components of our business, really. We've got what we call Meld Core, which is our middleware application

[00:00:47] For direct device application and peripheral control. And we also have Viana, which is our computer vision solution which we'll be talking about today. And I'm very excited to join you. [00:01:00] 

[00:01:00] Tim Baker: Thank you very much. Jason, did you want to give a quick introduction of yourself as well? 

[00:01:03] Jason Cremins: Yeah. Jason Cremins, founder and CEO of Signagelive.

[00:01:07] And I've been working with Raffi; Raffi and I have known each other for many, many years. And we're working together, and this is really the point at which we want to tell the world what we've been doing over the last 18 months to two years to combine audience analytics with digital signage.

[00:01:21] Tim Baker: I think, obviously, when we're talking about this, content effectiveness is a term that comes across a lot.

[00:01:26] A lot of the time when we're talking about this. So did one of you want to kind of break down what that means and what the benefits really are of the solution that's being put together?

[00:01:34] Raffi Vartian: Sure. Absolutely. Well, content effectiveness is a broad term that I think requires more definition. Specifically, as it relates to the digital signage market, content effectiveness has been kind of point-to-point, if you will.

[00:01:47] Cameras behind screens, in order to capture the visual attention to an individual piece of content, and to break down some basic information about the person that's [00:02:00] watching. So we can think about this as basic information about gender or age or attentiveness. It's extremely useful, of course, to get this information; you'll see networks all across the world where this is effectively a requirement for anything where there is a financial transaction related to the content that's playing on the screen.

[00:02:23] As you can imagine, there's a lot of desire for folks to have proof that the content has played, which Signagelive takes on natively, and then also to combine that with proof that there are folks actually watching that content, which is really where content effectiveness comes into play. It's the combination of

[00:02:42] cameras to validate that somebody is watching with the proof of play that Signagelive already provides. Combining those things together is broadly what the term content effectiveness is about. What we've been working on is to try to move beyond just the point-to-point of [00:03:00] the camera behind the screen, to take more of a global view of what's going on within a physical environment.

[00:03:07] So that we can start looking at not just someone that looks at the screen, but also what kind of other information we are gathering that may inform that data and make it richer. So you can think of this as almost the metadata surrounding the folks that are looking at the screen. And I think it's important to note here, and we'll note it again many times whenever we talk about this, that we're completely anonymized.

[00:03:36] We don't take any personally identifying information from the cameras themselves. We do all of our processing at the edge, which allows us to be fully GDPR compliant from the initial installation. So in summary, I'll do a quick version of that: content effectiveness is tying what people are doing to the content that's playing, to [00:04:00] ensure that we've got validation of viewership, and then information about who that viewer is and what that viewer is doing contextually within the environment.
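
To make that idea concrete, here is a minimal sketch in Python of what tying proof-of-play records to anonymized viewer detections by screen and timestamp could look like. The record fields and the matching logic are illustrative assumptions, not the actual Signagelive or Viana data model.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, List, Tuple

@dataclass
class PlayRecord:
    """One proof-of-play entry: what played, on which screen, and when."""
    media_id: str
    screen_id: str
    start: datetime
    end: datetime

@dataclass
class ViewerEvent:
    """One anonymized detection from the camera: no identity, just when and where."""
    screen_id: str
    timestamp: datetime
    attention_seconds: float

def attribute_views(plays: List[PlayRecord],
                    events: List[ViewerEvent]) -> Dict[Tuple[str, str, datetime], int]:
    """Count anonymized viewer events that overlap each proof-of-play record."""
    counts: Dict[Tuple[str, str, datetime], int] = {}
    for play in plays:
        key = (play.media_id, play.screen_id, play.start)
        counts[key] = sum(
            1
            for e in events
            if e.screen_id == play.screen_id and play.start <= e.timestamp <= play.end
        )
    return counts
```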

[00:04:11] Jason Cremins: I think what we're seeing more and more is the need to truly understand what's happening with the media and the impact of that media with regard to where it's being displayed. From a customer's point of view or from a network perspective, how can you quantify the success of a network if you can't measure it?

[00:04:30] It goes for a lot of things in life. So I think from our point of view, the first thing that we do as an organization is to ensure that, no matter which of the many, many devices that we support is being used, we have the ability to date- and time-stamp, add the additional metadata, as you spoke to, Raffi, and make sure that information is then sent somewhere.

[00:04:47] And in this case, obviously, it's the collaboration with meldCX and the Viana platform. It allows us to ensure that data is sent through to the Viana platform, and then it can be analyzed. And that's where the [00:05:00] content effectiveness comes in. It's that marriage between the actual events that happened at the screen and in the environment, in the store or public place, with the content that was playing, and what the impact was on the individuals that were looking at that screen. And as you said, Raffi, that starts off with gender and goes right through to sentiment, glances, all those other things that go on, and other aspects within the environment itself, you know, not just limited to cameras.

[00:05:25] Tim Baker: Perfect. Thank you. Just for those that aren't already aware of Signagelive and the piece that this is playing in the full solution, could you give us a quick overview of Signagelive and what benefits it brings to the table? 

[00:05:36] Jason Cremins: Yeah, absolutely. I mean, the solution that we're providing is a logistics platform for moving media from A to B and ensuring it will play back, and doing that in an incredibly streamlined, incredibly consistent way across all the different device technologies that we support. So we're not mandating to our customers what particular hardware, what particular screens or players they use in an [00:06:00] environment. We're trying to allow this kind of holistic solution whereby, from any single point via a web application or a cloud platform, you can upload media or schedule media and get that delivered onto the screen. So it's that operational aspect of what we do, and then upstream of that is how do we bring that data in, how do we bring that content in, and how do we create workflows for our customers that mean there are minimal human touch points from making a decision as to what needs to play to getting that onto the screen.

[00:06:28] And as we just spoke to as well, we then provide the ability to recall that information, what's played back, where it played back and pass it to solutions such as Viana. 

[00:06:37] Tim Baker: Perfect. Thank you. Yeah. One of the things that I wanted to mention as we go through is that some of the things Viana offers are this glance, view and view-through information, which allows you to work out which content is actually serving its purpose best and actually keeping those glances. That's something that's quite commonplace in other industries; YouTube analytics, for example: you [00:07:00] know, you wouldn't be able to make the content that you do on that platform without the feedback that you get. From a digital signage perspective, it's always been the piece that's missing. With proof of play, the data's there in terms of what you delivered, but the feedback has never been there to say this is what the end result of that content actually was. So then we always end up circling back to: the content is key. And this is that missing link between the two that lets you decipher what content is effective and what content is not. Raffi, in terms of that glance, view and view-through, could you go into a bit more detail on what data is going to be available to make those decisions?

[00:07:36] Raffi Vartian: Absolutely. And thank you very much for that introduction, Tim. I think it's a really good way to think about it: thinking about how the signage market has grown from the early days, I would say, Jason, not to betray our age or the amount of time that we've spent in this industry. But even in the beginning it was: we don't even know if the content is playing, right?

[00:07:58] We've hit the button. [00:08:00] And there's a question mark at the other end of that button to see if it's actually going on, right? So the first level of validation really is that proof-of-play, right? That's inherent within the Signagelive platform, to your point, Jason, regardless of what end point you're using, whether that be a system on chip screen or a PC or Android device, or kind of anything in between, right.

[00:08:20] Once you solve that question mark, the next one is: okay, well, it's playing now. Who's looking at it and who's viewing it, right? So in order for that market to take off, much like the signage market in the beginning, there was a requirement for a lot of computing behind the screen, right. So PCs were the first thing in the signage market, then it reduced down to non-PC devices, then it went system-on-chip. Same thing with computer vision solutions, right: big computers behind screens. And then you start to get down to, okay, how do we get these models that recognize that level of [00:09:00] attention down to a place where it's going to be affordable, so that we can get mass and we can get scale?

[00:09:05] It's the typical technology adoption curve, right? The first quarter is, you know, people who are willing to spend that kind of money, but to get larger adoption you need to kind of reduce it, okay. So that has led to models that are inexpensive, right, that sometimes can be loaded on a USB stick and put into the side of a screen.

[00:09:26] But the information that you're getting is very, very small, right, or maybe fairly limited. So you've got this kind of trend in the market where, in order to be able to reduce it down to that footprint from a cost perspective and a technology perspective, you've got to have limitations, right, whether that's only being able to look for a certain amount of time, or only being able to see one person at a time.

[00:09:45] What we're trying to do is marry the same kind of statistics that you would see on, like, YouTube, right, or other kinds of video platforms, and try to bring that into this space. So I've got my cheat sheet in front of me, so I'll go ahead and use [00:10:00] my cheat sheet. Some of the things that we do from a tracking perspective: we can look at humans to see, have we actually got the camera able to capture them? Can we see that there's human activity, but we're almost in a bit of a blind spot, whereby we know, based on the distance, that they're simply too close to the screen or off to the side, and there's no way to see the effectiveness of that content? So we're not counting those as folks that have the opportunity to see, right. So that's kind of your first baseline metric: what's the opportunity to view that information, right. And then from there, you're starting to look at views: the people that start to look up and almost give you a glance at the screen, right. So there, if you look at the proof-of-play statistics and then combine that with the timestamp of what Viana is producing, we can say how much of the 15-second video, let's say, the person has actually viewed. Same thing with YouTube [00:11:00] statistics: if you're like me, you mash that skip button as quickly as you can, because I'm averse to some of the commercials that come up on YouTube. But, you know, you've got folks that maybe look up because they see flashing lights, and then they go like this and they look away.

[00:11:14] All right. So that's a glance. That's not a view. That's not something that should be chargeable, if you will, right. And that's certainly a leading indication of what's going to capture people's attention, versus view-throughs, which get you through the entire piece of content. And then you've got almost the bounce rate in the middle, right,

[00:11:34] of people that glance up, look a little bit, but then look away, and then the folks that view through. So you've got three kinds of data points; I'm sorry, four data points, right? From the folks that are kind of there, that we can recognize as humans that are close but not actually looking, all the way through to somebody that's viewed the entire piece of content. Those are very important baseline statistics.
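
As a rough illustration of the four baseline metrics Raffi describes (opportunity to see, glance, view, view-through), a simple classifier might bucket each anonymized detection by how long it paid attention relative to the content length. The thresholds below are hypothetical values for the sketch, not Viana's actual ones.

```python
def classify_attention(attention_seconds: float,
                       content_seconds: float,
                       glance_threshold: float = 1.0,
                       view_through_ratio: float = 0.9) -> str:
    """Bucket one anonymized detection into one of the four baseline metrics."""
    if attention_seconds <= 0:
        return "opportunity_to_see"   # present near the screen, never looked up
    if attention_seconds < glance_threshold:
        return "glance"               # looked up briefly, then looked away
    if attention_seconds < view_through_ratio * content_seconds:
        return "view"                 # watched part of the content (the 'bounce' zone)
    return "view_through"             # stayed for (almost) the whole piece of content
```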

[00:11:57] Tim Baker: It's interesting you say that, because the fact that you've got [00:12:00] this anonymous data being collected on view-throughs is actually a level above the way that YouTube works, which we'll probably end up mentioning a couple of times, because you can't assume what someone is doing in the house when they're looking at a phone; they could just be planning on leaving. This actually adds that layer on top, where we go: no, no, we know that someone's glanced at this, this moment captured their attention, and we know these specific details about them that are anonymously captured.

[00:12:25] Just going back to the Signagelive part of that ecosystem with proof of play: Jason, could you give us a quick overview of where proof of play initially came from, and I guess how that ends up being part of that transition into Viana?

[00:12:39] Jason Cremins: Yeah, I think, as I said at the outset, there was always a need for organizations, certainly those that run what I consider media-led digital signage networks.

[00:12:48] So whether it's the generation of revenue, or the ability to make revenue from media or sponsors, or just purely from a compliance point of view, there was a very early need [00:13:00] for us to provide a platform that could offer that. And obviously that then very quickly gets to a lot of data being collected, especially if the information is being provided back to us, to our servers, in real time.

[00:13:12] You know, we're collecting that each time the ad plays back or the media plays back. So what we did was to create a complete module, a standalone infrastructure basically, within Signagelive for proof of play. And it's an optional platform that our customers can take on should they wish to, even down to the number of individual players they want to add it to on the Signagelive network.

[00:13:36] Once that data is within that server infrastructure, we provide some very simple reports ourselves. And at that point, really, you're just analyzing what played back and when it played back. It's nothing related to the audience; that's a completely separate requirement and a separate journey to go on with regard to that data.

[00:13:55] And so, when we're working with an organization such as meldCX with the Viana [00:14:00] platform, what we provide, using our APIs, is the ability to tap into that data and to bring that data back into their own platform, to then mash that up, as Raffi said previously, with the audience analytics that they capture.

[00:14:15] And most importantly, the end result of this is: what insights am I providing my customer? You know, what insights am I getting as a media owner, or as a network operator, from the combined information that we're providing that mean I'm going to do something different as a result? I might be doing something really well, and I'm going to carry on doing that; that's fantastic: it played well, I chose the right time of day and the right location to send that media down. But if I haven't, and I'm seeing success elsewhere, it gives me that opportunity to pause, reflect on the media that I'm sending down, and maybe change my strategy to, you know, go with the data, go with the insights that have been provided from Viana as a result of the integration with Signagelive.
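
For a sense of the API-based handoff Jason describes, where proof-of-play data is pulled from the CMS and passed to the analytics platform, here is a hedged sketch. The URLs, parameters and authentication are placeholders only; they are not the real Signagelive or Viana APIs.

```python
import requests  # third-party HTTP client (pip install requests)

# Placeholder endpoints and token: the real Signagelive and Viana APIs,
# parameters, and authentication will differ.
PROOF_OF_PLAY_URL = "https://api.example-cms.com/proof-of-play"
ANALYTICS_INGEST_URL = "https://api.example-analytics.com/ingest"
API_TOKEN = "replace-me"

def sync_proof_of_play(since: str) -> int:
    """Fetch proof-of-play records from the CMS and forward them for analysis."""
    headers = {"Authorization": f"Bearer {API_TOKEN}"}
    resp = requests.get(PROOF_OF_PLAY_URL, params={"since": since},
                        headers=headers, timeout=30)
    resp.raise_for_status()
    records = resp.json()  # assumed shape: list of {media_id, screen_id, start, end}
    ingest = requests.post(ANALYTICS_INGEST_URL, json=records,
                           headers=headers, timeout=30)
    ingest.raise_for_status()
    return len(records)
```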

[00:14:57] Tim Baker: Perfect. Thank you. Yeah. One of the things that we do a lot here is A/B testing of content to see its effectiveness. You know, it's putting in both options and seeing which one is actually performing better; without this data, that is impossible. And it's, you know, becoming more and more the typical market trend: for anything that you do, you do several versions and see which one has the best retention or glances or whatever it might be that you're trying to capture.
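
A simple way to picture the A/B comparison Tim mentions: compute the view-through rate for each creative variant from the classified detections and keep the better performer. The sample outcomes below are made up purely for illustration.

```python
from collections import Counter
from typing import List

def view_through_rate(outcomes: List[str]) -> float:
    """Share of detections that watched a variant all the way through."""
    if not outcomes:
        return 0.0
    counts = Counter(outcomes)
    return counts["view_through"] / len(outcomes)

# Hypothetical outcomes collected for two creative variants of the same message.
variant_a = ["glance", "view", "view_through", "view_through", "glance"]
variant_b = ["view_through", "view", "view_through", "view_through", "glance"]

better = "A" if view_through_rate(variant_a) >= view_through_rate(variant_b) else "B"
print(f"A: {view_through_rate(variant_a):.0%}  "
      f"B: {view_through_rate(variant_b):.0%}  keep variant {better}")
```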

[00:15:20] Raffi Vartian: Yeah. And to that point too, Tim, it's something that, Jason, you and I have discussed, I think, for our entire careers working together: there's going to be a time and there's going to be a place for a lot of these decisions to be rules-based, right. One of the hardest things about having a signage network is, you know, feeding the beast with content, right?

[00:15:42] I know that Signagelive is really focused on the ability to play back across multiple environments, a lot of HTML driven content, right? So that the screens are mirroring what can be done and accomplished on the web. And I'm wondering if you can talk on that because I want to key off on that to talk about a [00:16:00] few other things.

[00:16:00] So would you mind discussing that? 

[00:16:03] Jason Cremins: Yeah, no, I'm happy to do so. I think the biggest challenge we see for a lot of our larger customers, and for the organizations that we're talking to, is, as you quite rightly said, Raffi, how do you ensure the screens are playing fresh, contextual, engaging content that doesn't just blend into the background, that's relevant at that time to that person or to that particular audience?

[00:16:27] And in order to do that, firstly, you need the data on what's working and what isn't. But also, if we decide that there is a need to vary the content, and vary it at scale, you very quickly get into a real beast of a requirement in terms of feeding the machine with content. So what we're focusing on is: how do we provide automation around that?

[00:16:50] How do we replace that manual need to create individual assets and content, upload them and schedule them, and build an engine which allows us [00:17:00] to bring in data, bring in imagery, bring in rules and information, and, based on metadata, based on the insights that we have, allow the Signagelive platform to publish and play that content back based on all those rules, or based on the insights that are coming from Viana?

[00:17:16] So I think the full conclusion of where we're going to end up with this, which, you know, Raffi and the Viana team are working towards, is: we know what's played, we know what's been watched, we know the impact of that. As you said, Tim, how do we A/B test that? How do we do that?

[00:17:35] It's a cycle, you know, of putting that information together. So the next stage for us is: how do we take the insights that are coming from meldCX and feed our content creation engine, so we can then make those tweaks automatically, based on rules, and just keep testing, keep testing, until we refine it down so we get the optimum playback of media on any given device at any given time?

[00:17:55] Raffi Vartian: That's fantastic. Thank you, Jason. So the initial component of what we're working [00:18:00] on is that tie: is the content effective, right? In any of the conversations that we're having from an analytics perspective, the computer vision analytics perspective, it's: let's measure what's going on first, because we need a baseline.

[00:18:16] I always joke about this kind of in a medical environment where you go for your annual checkup to the doctor, and I've got family members who make sure that they eat very, very well for the two weeks before they go get their checkup, right. And then they get the one checkup per year, and then they come back in another year.

[00:18:31] That's not a lot of baseline, and it's artificial in a lot of ways. So we want to ensure that we understand everything from a baseline perspective, from an analytics perspective, and then we can build ideas on how to make that better: get that ground truth first, and then we can say, okay, based on the ground truth of almost nothing happening from a signage perspective, then we add something and we see what happens.

[00:18:54] Or we've got existing content: here's your baseline, and what happens after we make some adjustments, right? [00:19:00] One of the things that we talk about a lot is the idea of personas. The ability to look at male, female, age information and sentiment is kind of our baseline statistics now, but we're starting to work with customers that have upwards of 70 or 80 separate personas.

[00:19:23] So if you think about those characteristics and how you bring all those things together, it's a lot. And it's not the brick-and-mortar folks who have created those personas; it's the online and mobile folks that have created them. Those personas are based on the incredible amount of data that's gathered from your web activity and your mobile activity, effectively, right? So in a lot of ways we're trying to elevate the brick-and-mortar experience to be able to match what you can collect from online and mobile, right? And so the initial stage is [00:20:00] gathering that data and looking at the effectiveness of that content: what works, what doesn't work, how do we tweak things, and then how do we measure against that baseline?

[00:20:09] From there, we can start establishing rules, 'if, and, but, then' statements: if this happens, but this, then that, and how do we stack them? So utilizing the tools in Viana and the tools in Signagelive, marrying them together to be able to create those rules, and then eventually getting to the place where we're triggering content and we're affecting the content that's on the screen based on the rules that we've set up within both systems.
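
A minimal sketch of the kind of rules-based triggering Raffi describes, assuming a hypothetical audience "snapshot" with illustrative field names; neither the rule format nor the fields reflect the actual Viana or Signagelive implementation.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Rule:
    name: str
    condition: Callable[[Dict], bool]   # evaluated against the latest audience snapshot
    playlist: str                       # content to trigger when the condition holds

def evaluate(rules: List[Rule], snapshot: Dict, default_playlist: str) -> str:
    """Return the playlist the CMS should publish for the current audience context."""
    for rule in rules:
        if rule.condition(snapshot):
            return rule.playlist
    return default_playlist

# Hypothetical rule set: snapshot field names are illustrative only.
rules = [
    Rule("quiet_store", lambda s: s.get("people_in_view", 0) == 0, "ambient-brand-loop"),
    Rule("young_audience", lambda s: s.get("dominant_age_band") == "18-24", "new-arrivals"),
]

print(evaluate(rules, {"people_in_view": 3, "dominant_age_band": "18-24"}, "default-loop"))
# -> new-arrivals
```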

[00:20:34] Tim Baker: Yeah. That makes perfect sense. And it's an exciting point in time to be getting to this point, because I think that conceptually, the idea has been there, but it's about actually delivering this in a way that's manageable. 

[00:20:45] When you're talking about 70 different personas, the first thing that you think is: I can't manage 70 different personas worth of content, you know, it's just not possible.

[00:20:52] So it's more about how we then take those data ranges, those data sets, and say: what piece of those personas is important to focus in on with [00:21:00] content? Automating that and creating a system that's going to be able to ingest that and start making decisions effectively is kind of where this will end up getting to. And one of the questions that got raised, because you mentioned brick-and-mortar stores there: in terms of the benefits this is bringing, what verticals do you see this being most beneficial for, and who is it targeted towards?

[00:21:25] Raffi Vartian: Oh, that's a very big question, Tim. I'll do the best that I can. Of course, we start in retail, right. That is kind of the bread and butter, I think, for most of these applications, and where we find the most comfort for the customers to be able to have these types of conversations, because they have to, right.

[00:21:43] If retailers don't understand their customers, fundamentally they're not going to be successful at their core business, which is driving people into their environments, delighting them, and therefore driving sales and driving repeat visits, right. So first and foremost, it's about retail. Then obviously for the digital out-of-home [00:22:00] networks, this has been more and more of a requirement. I won't go into the sordid past of what digital out-of-home used to look like 15 years ago, but it's certainly very different today, where the minimum bar of entry is some level of proof: some level of proof that you've played, some level of proof that somebody viewed it.

[00:22:19] And to get that engagement, you need these statistics. I think the digital out-of-home societies have done extremely well at talking about the importance of data and gathering it. And so that becomes popular; it's just the zeitgeist of the digital out-of-home community.

[00:22:38] There's an assumption that you must measure, right? So I would say retail, number one; digital out-of-home, number two. But what we are finding is that the contextual data that we get is going to be important for places where you wouldn't think that signage has a play. So these are things like [00:23:00] warehouses and manufacturing environments, to be able to provide health and safety information on an ongoing basis.

[00:23:07] We talked earlier about kind of point-to-point camera behind screen. But if we look at the ways people are interacting within a warehouse, and if they're getting too close to a forklift, or if they've done something wrong, they've gone into an area that's not right, you can do a couple of things.

[00:23:25] You can reinforce messaging on screens, to see whether that makes a difference in the overall behavior of an environment, and then start to trigger content in combination with other things like audio alerts, to make that whole environment more interactive and automated, because it's a place where not a lot of technology has gone in.

[00:23:47] So in retail, the infrastructure is really starting to be there. Our biggest challenge, you know, 10, 15 years ago was 'is there internet in the store?', right? [00:24:00] Now a lot of these POS systems are running off of consumer-level devices, right; Apple is moving to push Square aside and take payments directly on phones for the smaller retailers.

[00:24:12] So you're starting to see that infrastructure challenge kind of go away, and there's a lot of comfort within that environment. But the growth could actually be in places like warehouses, right? Large facilities where you need to not only understand the person that's looking at the screen, but also understand all the environmental factors that are going on, to provide enough context to inform the content and trigger the content that's on the screen.
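
To illustrate the warehouse example, a toy proximity check might trigger a safety playlist and an audio alert when anonymized floor positions show a person too close to a forklift. The distance threshold and action names are assumptions for this sketch only.

```python
from math import hypot
from typing import List, Tuple

SAFE_DISTANCE_METRES = 3.0   # assumed threshold, not a real site policy

def safety_actions(person_xy: Tuple[float, float],
                   forklift_xy: Tuple[float, float]) -> List[str]:
    """Decide which reinforcements to trigger when a person nears a forklift.

    Coordinates are anonymized floor positions from the vision system;
    no identity is involved, only where activity is happening.
    """
    distance = hypot(person_xy[0] - forklift_xy[0], person_xy[1] - forklift_xy[1])
    actions: List[str] = []
    if distance < SAFE_DISTANCE_METRES:
        actions.append("publish_playlist:forklift-safety-reminder")  # nearest screen
        actions.append("audio_alert:proximity-warning")              # paired audio cue
    return actions

print(safety_actions((1.0, 2.0), (2.5, 2.0)))   # within 3 m, so both actions trigger
```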

[00:24:38] Tim Baker: Perfect. 

[00:24:38] Jason Cremins: Yeah, I think that's the area that excites me the most: the aspect of efficiency. Compliance and efficiency, I think, are the two big areas, because, as you said, whether it's in a warehouse for health and safety, or in a production environment where, you know, a mistake packing something could result in a whole shipment being incorrect and being sent back, and the ongoing cost of that, or, you know, [00:25:00] it could result in some form of legal action because they packed the wrong product into the wrong packaging.

[00:25:05] And it's gone off to the customer, and they've taken that, you know, rightly so, to the press or what have you, because they feel as though there's some damage that can be done from that. So I think some of those kinds of insurance policies that come with having a solution like the one we've discussed, Raffi, I think, are great.

[00:25:19] And the ability to have the displays as an extension to that, whether the content in the environment is changing to evolve with what's happening, a 'be careful next time', or a 'hey, hang on, you've got to stop that; we need to pause, we need to reflect on what's happened here, and we need to call the supervisor.'

[00:25:36] Yeah. Screens can have a big part to play alongside the work that you're doing, to have that immediate trigger, as well as that kind of, I suppose, soft trigger: that ability to say 'do this next' as opposed to 'do this now'. And I think that's how we see the two elements.

[00:25:51] The other thing I think is worth talking to as well, Raffi, is for those that have concerns. We've covered the fact that we're using models, we use the [00:26:00] personas, we're using, you know, data that's actively encrypted and isn't tied to any one individual. If people want to steer away from faces, could you talk a bit about the other aspects of what you can use Viana for in terms of tracking in an environment, whether it be products or logos or other things?

[00:26:20] Raffi Vartian: A hundred percent. And thank you for that. And I also want to have a bit of a conversation after this point about secure dashboards, about how we display that data back, right, and how the secure dashboards give the people that are managing those kinds of networks, or managing a facility, a big leg up, right,

[00:26:38] to be able to have all that information authenticated. Let's talk a little bit about the philosophy for Viana, then a little bit about the how, and then we'll talk about the kind of outcomes, right? So, philosophically, the problem that people have with cameras and artificial intelligence is that there have been lots of horror stories within [00:27:00] the public and within the press, from those that have not thought carefully about the ways that they're implementing the technology.

[00:27:12] So the over-reliance on facial recognition in this industry is exceptionally problematic. That's why you're seeing California with privacy laws, GDPR really making headway, as everyone can read the news about Facebook and all the rest of that, right, and about Apple saying 'do not track', right. What I mean by this is that Viana has not utilized the models that are coming from the large technology companies, which are problematic by nature. And they're problematic because they've collected information: when you signed up to those free networks or uploaded your photos, right,

[00:27:56] and they say, well, turn on facial recognition so that we can group things together, [00:28:00] when you click yes, you allow them to gather all that information and create facial recognition models, right. So what we do is we don't look for the face. Except for the basic information about age, sentiment, gender that we talked about before, and directionality, we don't look at the 62 points of data that effectively create a facial recognition ID, right?

[00:28:27] There are a lot of reasons behind that, but first and foremost is that we don't believe it's necessarily that valuable, right. There are more important things that we can learn about human activity that will, number one, make the customer more comfortable (the folks that are going to buy it, the retailer or whomever), and then make the patrons within those environments more comfortable as well, to say there might be cameras here, but we don't know who you are.

[00:28:50] We're looking for things that are contextual. So if you think about web statistics: if somebody knew something about me, they would know, you know, generalized age, kind [00:29:00] of, you know, location (we don't know location information outside of the brick-and-mortar site), but also the things that I like, right. And what we have found is that the things people wear say a lot about them, right?

[00:29:11] There's a reason why there's a trillion-dollar fashion industry, or whatever the number is: the things that we put on our bodies when we go out to meet the world say a lot about who we are. So we can look for things like logos. We can look for things like color. We can look for things like AirPods in the ear, right.

[00:29:30] And it tells us a lot about what the persona of the individual is. So I don't really care that it's Raffi, right. But what we do care about is that the person may have a little bit of trouble with their eyes, that they like Apple, right, because they've got AirPods in, and that data is more valuable than knowing who the person is.

[00:29:49] There's too much risk in the person; there's much more value in knowing contextually what that persona is. So that's what we're focusing our efforts on: [00:30:00] effectively tracking the metadata around an individual, as opposed to the individual themselves, because that metadata feeds the persona.

[00:30:08] That persona feeds the data, that data enriches the environment, and it gives the decision-makers better quality information to make their decisions on.
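
As an illustration of the metadata-to-persona idea, a sketch like the following could map anonymized, contextual attributes (logos, colours, earbuds, age band) onto a named persona. The attributes and persona names are hypothetical, not Viana's actual taxonomy.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PersonaObservation:
    """Anonymized, contextual attributes only; no facial-recognition ID is stored."""
    age_band: str                                          # e.g. "25-34"
    sentiment: str                                         # e.g. "neutral"
    detected_logos: List[str] = field(default_factory=list)
    dominant_colours: List[str] = field(default_factory=list)
    wearing_earbuds: bool = False

def assign_persona(obs: PersonaObservation) -> str:
    """Map an observation onto one of a handful of illustrative personas."""
    if obs.wearing_earbuds and obs.age_band == "25-34":
        return "connected-commuter"
    if obs.detected_logos:
        return "brand-led-shopper"
    return "general-visitor"

print(assign_persona(PersonaObservation("25-34", "positive", wearing_earbuds=True)))
# -> connected-commuter
```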

[00:30:19] Tim Baker: Well, thank you very much, Jason, Raffi, for joining and having this very interesting conversation about the environment as it stands and what this brings to the table. Any final thoughts from either of you on, you know, the total offering and what this will deliver?

[00:30:33] Jason Cremins: Yeah, I'm happy to take that. I think, from our point of view, it's the first time that we've felt as though we've got a solution that covers all the needs to really get the most out of the kind of data that we're able to generate. And most importantly, it's going to provide those that are looking at large-scale deployments of digital signage (and obviously the not insignificant expense that comes with that) with that peace of mind, that insurance policy, that as they roll out their technology, the concept that [00:31:00] they're deploying is actually being effective and they're getting the best possible return from their digital signage solution.

[00:31:06] Tim Baker: Perfect. Thank you, Raffi. Anything else you want to append to that? 

[00:31:09] Raffi Vartian: Well, I would like to finish by saying that it's a homecoming of sorts for me, so it is personally satisfying to be able to do this work. But it's not about personal relationships; as much as it is beneficial to have a personal relationship, that's not what this is based on.

[00:31:27] This is based on the right technology at the right time for the customers: technology that suits their needs, allows them access to data, and lets them set up rules in order to get their jobs done. Because if we don't do our jobs from a technology integration perspective, this is technology for technology's sake; it doesn't solve any business problems.

[00:31:50] So I'm excited to take it out to the market and see what business problems we can solve for our customers.

[00:31:55] Tim Baker: Well, thank you very much, guys. I really appreciate your time. If you have any other questions around this and what [00:32:00] you've heard and you want to learn more, feel free to reach out to us wherever this video is embedded.

Otherwise, thank you very much for joining us!

