This is the second in a series of posts about how the brands of broadcast technology vendors were ranked by respondents to the 2010 Big Broadcast Survey (BBS).
Each year as part of the Big Broadcast Survey (BBS), a global sample of broadcast professionals is asked to rate their opinion of a number of technology vendor brands on a wide range of metrics. This information is used to create a series of reports, which through benchmarking and industry “league tables” enable these vendors to understand their competitive position in the market.
More than 5,600 people in 120+ countries participated in the 2010 BBS, making this the largest ever and most comprehensive study of the broadcast industry. In addition to measuring a variety of broadcast industry trends, more than 100 vendor brands (in 27 separate product categories) were evaluated by respondents.
Recently, I discussed how respondents to the 2010 BBS ranked the Top 30 Broadcast Technology Vendor Brands by Overall Opinion, globally and regionally.
Appearing in the top 30 of an overall opinion poll is obviously a good place for any vendor to be, but this only scratches the surface of how the market views a brand.
While indicative of the market’s view, these overall opinion rankings are presented as a snapshot in time. They also provide a somewhat one-sided view of how brands are regarded because they take only positive perceptions into account. In order to get a better understanding of how broadcast technology vendor brands are perceived, it is necessary to look at both the positive and negative opinions of brands, and to take into account how these opinions have changed over time.
One way to do this is to ask people who have an opinion of a brand how their opinion of that brand has changed over time – i.e. has it improved, declined, or stayed the same.
When you do this, you can get some interesting results. It turns out that some brands are more polarizing than others, with different respondents having very different opinions. For example, here’s a chart from the 2009 Big Broadcast Survey.
Notice that in the above table, the company that was ranked #1 for “got better” was also ranked #1 for “got worse.”
Given these results, it is perhaps more useful to calculate the Net Change in Overall Opinion for each brand, which is calculated by using the following formula:
Net Change in Overall Opinion = (GB − GW) / total number of respondents, where GB is the number of respondents who said a brand “got better” and GW is the number who said it “got worse.”
In other words, the percentage of respondents who said a brand “got worse” is subtracted from the percentage of respondents who said their opinion of the brand had “got better” (the “stayed the same” responses are ignored).
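The calculation described above can be sketched in a few lines of Python. The response counts used here are invented for illustration and are not actual BBS data:

```python
def net_change(got_better, got_worse, stayed_same):
    """Net Change in Overall Opinion: the percentage of respondents saying
    "got better" minus the percentage saying "got worse"."""
    total = got_better + got_worse + stayed_same
    pct_better = 100.0 * got_better / total
    pct_worse = 100.0 * got_worse / total
    return pct_better - pct_worse

# Example: a polarizing brand -- a strong "got better" response is
# largely cancelled out by a strong "got worse" response.
print(net_change(got_better=40, got_worse=35, stayed_same=25))  # 5.0
```

Note that the “stayed the same” count affects the result only through the denominator: it dilutes both percentages equally.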
This takes into account both the positive and negative perceptions of brands, along with how these opinions have changed over time. It also presents a more balanced view of which brands are getting better and which are getting worse in the minds of market participants.
Because some brands are polarizing (as seen in the example above), it’s possible for a strong “got better” response to be cancelled out by a strong “got worse” response. As a result, some companies that were rated in the top 30 on the “got better” score alone were not included in the global or regional top 30, because their high “got worse” score dragged down their overall result. At the same time, a few of the companies with high “got worse” scores still made the top 30 list because these negative scores were cancelled out by even higher “got better” scores.
In order to arrive at the Net Change in Overall Opinion, research participants were asked whether their opinion of various brands had “got better”, “got worse” or “stayed the same” over the past 2-3 years.
The results of this enquiry are shown below in two ways:
- An overall industry “league table” that shows the 30 highest ranked vendors for the metric “Net Change of Overall Opinion.” The data in this chart is broken out globally and regionally.
- An analysis of the “frequency” of appearance in the “Net Change of Overall Opinion” league table.
The top 30 ranked brands for Net Change of Overall Opinion are shown below for both the global sample of all respondents as well as for all respondents in each of the geographic regions.
In all cases, these results are shown in alphabetical order, NOT in the order in which they were ranked by respondents to the survey.
Question: Has your opinion of the following brands improved or declined over the past 2 years in relation to the broadcast technology products / services they provide?
Interestingly, a total of 65 broadcast technology vendor brands are included in this table, demonstrating the strong variation in opinion change based on geographic segmentation of respondents.
In terms of frequency of appearance in this table:
- 3 brands appear four times, meaning they were ranked in the top 30 globally and in each geographic region
- 10 brands appear three times
- 26 brands appear two times
- 26 brands appear once, which demonstrates that some brands are strongest in one geographic area
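The frequency analysis described above can be sketched as follows: count how many of the four league tables (global plus three regions) each brand appears in. The region names and the brand lists below are placeholders, not the actual 2010 BBS league tables:

```python
from collections import Counter

# Placeholder league tables -- real tables each contain 30 brands.
tables = {
    "Global":   ["Barco", "IBM", "Ikegami", "Avid"],
    "Region 1": ["Barco", "IBM", "Ikegami", "Chyron"],
    "Region 2": ["Barco", "IBM", "Ikegami", "Avid"],
    "Region 3": ["Barco", "IBM", "Ikegami", "Avid"],
}

# Count appearances of each brand across all four tables.
appearances = Counter(brand for brands in tables.values() for brand in brands)

# Brands appearing in all four tables (global and every region).
print(sorted(b for b, n in appearances.items() if n == 4))
# ['Barco', 'IBM', 'Ikegami']
```

A brand appearing four times is strong everywhere; a brand appearing once is strong only globally or only in a single region.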
Analysis of the data shows that there are some clear market leaders on a global basis, while other brands are strong on a regional basis.
A breakdown of how many times each brand appears in the chart above is shown below.
Brands appearing four times:
- Barco, IBM, Ikegami
Brands appearing three times:
- Avid, Chyron, For-A, JBL, JVC, Mackie, Motorola, Siemens, Telex, Yamaha
Brands appearing two times:
- AKG, Audio-Technica, Axon, Dayang, Dolby, Echolab, Electro Voice, EMC, EVS, Fujitsu, Grass Valley, Harmonic, Harris, Klein + Hummel, Orad, Pesa, Pharos, Quantel, RTS Intercom Systems, SeaChange, Shure, Snell, Solid State Logic, Sundance, Tandberg / Ericsson, Tektronix
Brands appearing once:
- Accenture, AMS-Neve, beyerdynamic, Dalet, Evertz, Focal, HP, KRK Systems, Leader Instrument, Marshall Electronics, Miranda, Net Insight, Neumann, Omneon, Omnibus, Pilat, Pixel Power, Quantum, Rohde & Schwarz, Ross Video, S4M, Screen Service, Sintecmedia, Utah Scientific, Vizrt, Wheatstone
Analysis of overall opinion by region:
The table below shows the global and regional performance for each brand in the top 30 ranking of overall opinion.
The frequency chart shows some interesting geographic variation in the data, which is highlighted below.
Interestingly, the following 13 brands appear in the top 30 Net Change in Overall Opinion ranking for the global sample, but not in any of the regions.
- Accenture, AMS-Neve, Focal, KRK Systems, Leader, Net Insight, Omnibus, Pilat Media, Pixel Power, Quantum, Sintecmedia, Utah Scientific, Wheatstone
There are a number of possible explanations for this. For example, these companies may have fared well in each of the regions, but not well enough to make the regional top 30. However, when all responses are aggregated, their positive data propels these brands into the top 30 on a global basis. It is also possible that these brands scored well on a regional basis, but that the regional sample was insufficient for them to be included in the regional rankings.
All regions, but not global
Interestingly, for four brands the converse of the above also occurred – i.e. these brands made the top 30 list for Net Change of Overall Opinion in each of the three regions, but not in the global sample.
- Avid, For-A, JBL, Yamaha
Again this is due to a variety of factors including the aggregate strength of certain brands, coupled with sample sizes.
Global + one region
Nine brands managed to achieve a top 30 ranking in the global Net Change in Overall Opinion league table, despite being in the top 30 of only one of the three geographic regions.
- Dayang, Echolab, Electro Voice, Fujitsu, JVC, Motorola, Pesa, Quantel, Sundance
The following brands did not make the top 30 in the global league table of overall opinion, but they did appear in the top 30 overall opinion ranking in one of the geographic regions:
- Beyerdynamic, Dalet, Neumann, S4M
- Evertz, HP, Miranda, Omneon, Rohde & Schwarz, Ross Video, Screen Service
- Marshall Electronics, Vizrt
Please keep in mind when reviewing this information that all data in these charts is presented in alphabetical order, not in the order in which brands were ranked by respondents to the 2010 BBS. Also, the charts in this posting measure the responses of all 2010 BBS respondents, regardless of their company type, company size, geographic location, job title, or budget for broadcast technology products.
In order to get full value from this data, it is necessary to evaluate these results on a granular basis. If you would like more information, please contact Devoncroft Partners.
This article is based on the findings from the 2010 Big Broadcast Survey (BBS), a global study of industry trends, technology purchasing behavior and the opinion of vendor brands. With more than 5,600 people in 120+ countries participating, the 2010 version of the BBS is the largest and most comprehensive market study ever done in the broadcast industry.