Making the Case for Open Benchmarking

Jeff Hastings, CEO, BrightSign

There’s a reason highly sought-after integrators and technical consultants are well paid for their services. One of their many roles is to understand what their customers hope to achieve with their digital signage and to scope out an ecosystem that supports those goals. Doing this well requires deep knowledge of the many software platforms and hardware components that make up a well-oiled network. Interoperability is never guaranteed, and vendor-claimed performance data rarely mirrors real-world performance.

A number of factors affect the performance of any AV network. Basic hardware/software interoperability plays a role, but one of the most common sources of strain is the content itself. File size and format are important variables, as are the I/O protocols used to distribute that content throughout the network. Unless a specific network is staged and tested prior to deployment (which we highly recommend), there’s no way to know exactly how a digital signage network will perform until it’s up and running.

If pre-deployment staging is not an option, the next-best approach is to study reputable benchmark results. Benchmark testing makes it possible to compare hardware performance across a wide range of content sources. While this sort of testing won’t perfectly replicate real-world network performance, it does provide an apples-to-apples comparison of how different hardware devices handle the same content.
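
To see what this kind of testing can look like in practice, here is a minimal sketch of one common measurement a browser-based benchmark can take: sampling the rendering frame rate of a page that is already animating content. It illustrates the general technique only; it is not any particular vendor’s test code.

// Minimal sketch of a browser-based rendering measurement, assuming the
// page is already animating content (e.g. a video or an HTML animation).
// Illustrative only; not any vendor's actual benchmark code.
function measureFps(durationMs: number): Promise<number> {
  return new Promise((resolve) => {
    let frames = 0;
    const start = performance.now();

    function tick(now: number) {
      frames++;
      if (now - start < durationMs) {
        requestAnimationFrame(tick);
      } else {
        // Average frames per second over the sampling window.
        resolve((frames * 1000) / (now - start));
      }
    }
    requestAnimationFrame(tick);
  });
}

// Usage: sample for five seconds and report the average frame rate.
measureFps(5000).then((fps) => console.log(`~${fps.toFixed(1)} fps`));

Because the browser exposes the same timing APIs on every platform, a measurement like this can run identically on very different media players, which is what makes an apples-to-apples comparison possible.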

We believe so deeply in the practice of universal benchmarking that we developed the Digital Signage Benchmark (DSB) back in 2018. The idea was to create a platform-agnostic measurement tool that lets people see for themselves how different media players process graphics, video and HTML files, making it easy for anyone to run a series of industry-standard, proven, independent third-party tests on any device with just a browser and an internet connection. The DSB includes not just performance data, but also a cost comparison that lets people weigh the cost/performance value of each player. We feel this transparent approach, exposing exactly what the third-party tests and testing methodologies are and empowering anyone to test for themselves, is the only way for a benchmarking platform to have widespread credibility.
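
As a concrete (and entirely hypothetical) illustration of what a cost/performance comparison involves, the sketch below ranks players by benchmark points per dollar. The player names, prices and scores are invented placeholders, and the DSB’s actual methodology may weight things differently.

// Hypothetical cost/performance comparison. All names, prices and
// scores below are placeholders, not actual DSB results.
interface PlayerResult {
  name: string;      // media player model (hypothetical)
  priceUsd: number;  // street price of the device
  score: number;     // composite benchmark score (higher is better)
}

const results: PlayerResult[] = [
  { name: "Player A", priceUsd: 350, score: 1200 },
  { name: "Player B", priceUsd: 500, score: 1500 },
  { name: "Player C", priceUsd: 250, score: 700 },
];

// Value = benchmark points per dollar; one simple way to rank players.
const ranked = results
  .map((r) => ({ ...r, valueScore: r.score / r.priceUsd }))
  .sort((a, b) => b.valueScore - a.valueScore);

for (const r of ranked) {
  console.log(`${r.name}: ${r.valueScore.toFixed(2)} points per dollar`);
}

A simple points-per-dollar ratio is only one way to express value, but the larger point stands: when both the inputs and the formula are out in the open, anyone can check the conclusion for themselves.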

In contrast to the Digital Signage Benchmark, there has been some recent buzz in the industry about certain vendors citing benchmark performance results while cloaking the actual data behind those claims. That raises an obvious question: what value do those results have if the data and the methodology aren’t openly disclosed? Are we expected to take these benchmark claims at face value? It’s a slippery slope when vendors start making unsubstantiated performance claims. At BrightSign, we prefer full disclosure of what’s being tested and how it’s being tested. Anything short of full transparency renders a benchmark test uninformative and may rightfully be interpreted as intentionally deceptive.