Last Updated: 04/9/2024
Digital signage has, until recently, had a fairly big blind spot. Conceived in the world of advertising, and more recently adopted by industries like education, retail, hospitality and places of worship as a means of refreshing and visualizing key information, digital screens are widely perceived to be powerful. But can we prove it? As an industry, digital signage has only recently begun to ask:
“How do I know if anyone is actually watching my screens?”
ScreenCloud’s CEO, Mark, often compares digital signage today to the early days of the web. For those of us working here who are too young to remember, let’s recap:
Bounce rate, attribution, and behavioral flow are now essential terms in any self-respecting digital marketer's vocabulary. More importantly, modern websites are rarely altered unless the change is supported by data. Choosing how to design a new landing page, where to send the user next, or how to rejig a site's hierarchy comes only after a good long look at Google Analytics, A/B testing platforms and other tools that show how users are interacting with the site today.
Compared to the reams of data produced by websites, how many digital signage deployments can tell you the dwell time around each location, how many views there were per content piece, or how many viewing opportunities a screen generates per week?
It seems very few can. To date, deciding what content to show on digital signage screens, and when (excluding conventional media-buying channels, where the content schedule is optimized algorithmically), has been the equivalent of licking an index finger and holding it to the breeze.
So here at ScreenCloud we set out to perform a brief experiment, to answer the perennial question:
“How difficult is it to see who’s actually engaging with the digital signage content shown?”
Handily, in the modern era of SaaS, there are several analytics vendors offering tracking methods such as footfall tracking, iBeacons (a Bluetooth Low Energy technology that can tell when a smartphone is in range), facial tracking, and more.
Having reviewed a variety of vendors and methodologies, we settled on a mutual Intel partner called Quividi, which produces anonymized facial-tracking data using webcams. This was an important distinction for this initial experiment: we didn't want facial recognition, which identifies who is watching (often classifying viewers by demographics, ethnicity and so on); we just wanted to detect whether anyone was watching at all.
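Quividi's detection pipeline is proprietary, but to give a flavour of what "anonymized facial tracking" means in practice, here's a minimal sketch using OpenCV's bundled Haar cascade. The webcam index and detection thresholds are our own assumptions, and only a running count of faces is kept, never an identity or an image:

```python
# A rough illustration of anonymous "watcher" counting, assuming a webcam at
# index 0 and OpenCV installed (pip install opencv-python). This is not
# Quividi's pipeline, just the general idea: detect that faces are present
# without identifying anyone.
import cv2

# Frontal-face Haar cascade that ships with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # default webcam
try:
    while True:  # Ctrl+C to stop
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        # Only the count is logged; no frames or identities are stored.
        print(f"faces in view: {len(faces)}")
finally:
    cap.release()
```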
Our setup consisted of an Intel NUC mounted behind the 4x1 video wall in front of the elevators on our floor, with a webcam mounted just underneath. Once we'd confirmed the data was uploading and the NUC's power settings were right, we simply left it running for an extended period of time.
Our webcam sat just below the centre-right monitor of the video wall.
This would monitor:

- Opportunities to see (OTS): the number of people passing within view of the screens
- Watchers: the number of those people who actually looked at the screens
- Dwell time: the total time people spent in front of the screens
- Attention time: the total time people spent actually watching
Over the two-month duration of this experiment, we had some key learnings:
In an average week, we saw an OTS of 4,002 and 1,942 Watchers, which works out to a conversion ratio of 48.5%.
In other words, roughly every second person who walks past the screens stops to look at them.
We learned that people standing in front of the screens and people watching the screens weren't necessarily the same thing. People might hold meetings in that area or be looking at their phones, for example. This is an important distinction, as many experiments would conflate passers-by with actual watchers.
In an average week, there were roughly 5,000 seconds of total attention time, compared to 13,000 seconds of dwell time: an attraction ratio of 38.4%.
This implies that just over a third of people's time in front of the monitors is spent actually watching.
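To make the arithmetic behind both ratios explicit, here's a quick sketch using the figures quoted above. Note that the quoted dwell and attention totals are rounded, so the computed attraction ratio lands at 38.5% rather than the 38.4% reported from the unrounded data:

```python
# Reproducing the two headline ratios from our weekly numbers.
ots = 4_002          # opportunities to see: people who passed the screens
watchers = 1_942     # people who actually looked at the screens
dwell_s = 13_000     # total seconds spent in front of the screens (rounded)
attention_s = 5_000  # total seconds spent actually watching (rounded)

conversion_ratio = watchers / ots         # share of passers-by who watch
attraction_ratio = attention_s / dwell_s  # share of dwell time spent watching

print(f"conversion: {conversion_ratio:.1%}")  # 48.5%
print(f"attraction: {attraction_ratio:.1%}")  # 38.5% (38.4% before rounding)
```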
Quividi breaks all of the above metrics down by day of the week and hour of the day. Using this, we gained far greater insight into the optimal times to reach viewers.
Statistically, the days with the highest conversion ratios were Friday (50.59%), Thursday (49.82%) and Monday (48.43%).
Friday, stereotypically the least productive work day, produced the greatest conversion and attraction ratios. Despite nearly 100 fewer OTS, people watched the screens almost as much as on Tuesday, and for more time than at any other point in the week.
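For anyone wanting to reproduce this kind of ranking from their own data, a sketch of the per-day analysis follows. The daily OTS and Watcher counts below are hypothetical (chosen so that the three conversion percentages quoted above come out as reported); in our case Quividi supplied the real breakdown:

```python
# Rank days of the week by conversion ratio (watchers / OTS).
# Counts are illustrative placeholders, not our measured data.
weekly = {
    "Monday":    (605, 293),  # (ots, watchers)
    "Tuesday":   (605, 290),
    "Wednesday": (620, 290),
    "Thursday":  (560, 279),
    "Friday":    (510, 258),
}

by_conversion = sorted(
    ((day, watchers / ots) for day, (ots, watchers) in weekly.items()),
    key=lambda pair: pair[1],
    reverse=True,
)
for day, ratio in by_conversion:
    print(f"{day:<10} {ratio:.2%}")  # Friday 50.59%, Thursday 49.82%, ...
```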
This initial experiment gave us great insight into our own internal communications environment, and has triggered further thinking and experiments on how we can better optimize the scheduling of content. There is still work to be done to reduce the educated guesswork involved in deciding what time of day is best to schedule content, and that's something we'll keep striving for.
In particular, we've gained clarity on:

- How many people pass our screens versus how many actually watch them
- How much of viewers' dwell time is genuinely spent watching
- Which days and hours of the week our screens make the most impact
The next step for us is to integrate a solution like Quividi with the ScreenCloud platform, to give our customers more detailed information on digital signage views.
All of the above begins with analytics, and with knowing whether people are watching. The answer, for at least half of us here at ScreenCloud, is yes.
To find out more about ScreenCloud and our work towards better digital signage content, check out our blog or feel free to contact hello@screencloud.com.
ScreenCloud works on any screen, TV or device, and there's no need to give us your credit card details.