Ten Perils of Research-Based Content

Research is a brilliant way to demonstrate thought leadership, start a conversation and achieve your positioning and profiling objectives – so long as your methodology, analysis and commentary will withstand the scrutiny of your stakeholders.

On 11 October 2015, Brand Finance released its 2015 Nation Brands report, an annual study which indexes nations based on the six dimensions (exports, governance, culture, people, tourism and immigration/investment) that contribute to national brand image. The research methodology is purportedly complex, which implies it takes time to conduct and analyse.

The 2015 report was accompanied by a press release which linked a key result – namely, Germany’s ranking – to the Volkswagen emissions scandal: ‘VW scandal loses Germany top spot as most powerful nation brand’.

Yet, according to The Guardian, the VW scandal did not become public until 20 September, and then took several days to unravel.

Noting the complexity of the methodology, I challenged Brand Finance about the timing: How was Brand Finance able to measure the impact of a single brand in crisis on its country of origin within just over two weeks? I was advised a ‘review’ had been conducted after the research period closed. My other questions went unanswered and I was left unconvinced.

One doesn’t need a degree in quantitative analysis to know the scandal will leave a measurable mark upon VW’s marque for years to come. Or to surmise that brand Germany will suffer collateral damage. But to me, associating the 2015 Nation Brands results with the crisis was little more than a cheap PR grab. Not only did it cause me to question the integrity of the research, it also left me with a negative perception of Brand Finance.


Fast forward a few months and LexisNexis has just released Law Firms in Transition: Marketing, Business Development and the Quest for Growth. Based on two rounds of research conducted in late 2015, it’s a schmick-looking report. And yet, while the initiative was good, the execution was so poor that the investment was wasted.

Firstly, important respondent profiling data was not captured, or was omitted from the report. This raises questions about the credibility of the research, and negatively impacts how it is perceived. For example:

  • There’s no indication of the seniority of the respondents. Were they coordinators with five years’ experience? Directors with 15 or more years’ experience? Were they individuals who are in positions that grant them access to strategic discussions, financial information and the like?
  • There’s no breakdown of the respondents by firm size or structure, yet we all know the organisational structure of a single-state practice will differ vastly from that of a federated national firm, or from a financially-integrated global practice. In a single-state firm, it might make sense for the same person to head both marketing and business development; in the latter two examples, probably not. Similarly, large firms tend to employ specialists; smaller firms, generalists.
  • Perhaps most importantly, how did the respondents distinguish between marketing and business development? A key finding of the report is that marketing and BD are separate functions requiring different skill sets. Yet, other than a passing reference to firms being “clear on the lines of demarcation”, no explanation is offered. That demarcation needs to be clarified in order to identify which skills are common to both functions and which are peculiar to each. In law firms, and in other professional services categories, ‘business development’ can mean everything from competitive proposals to cold calling. I can’t help but wonder if this survey simply gave voice to a whole lot of law firm marketers who “don’t want to do tenders”.

Putting the sample and respondent data aside, the survey analysis, and the way it is reported by LexisNexis, are also questionable. One example: the report states that the top two areas in which law firms plan to increase or significantly increase their investment are:

  1. Technology Tools (CRM or marketing automation) and
  2. Client/Prospect Analytics.

That statement, though convenient (the sponsor of the research is InterAction), is simply false for two reasons:

Firstly, when you combine Significantly Increase and Increase, Thought Leadership ranks highest, not Technology Tools (see graph below).

Secondly, the list of tactics presented by LexisNexis is arbitrary, contains duplications and is ambiguous, rendering the data unreliable. Here’s one example:

  • ‘Content marketing’ is, by definition, “a strategic marketing approach focused on creating and distributing valuable, relevant, and consistent content to attract and retain a clearly-defined audience — and, ultimately, to drive profitable customer action.”
  • In a law firm, the valuable content which is being created and distributed is in the form of articles, bulletins, newsletters, blogs, webinars and so on. It’s what lawyers – and law firm marketers – have traditionally referred to as ‘thought leadership’.
  • Consequently, had Blogging/Content Marketing been bundled with Thought Leadership, it would have been the standout area of heightened investment and the key message would more accurately have been something like “Law firms discontent with their content investment”.

I could go on but there’s little need.

While the initiative was good, the execution by LexisNexis was poor and therefore the investment was wasted.

Research is a brilliant way to demonstrate thought leadership, start a conversation and achieve your positioning and profiling objectives – so long as your methodology, analysis and commentary will withstand the scrutiny of your stakeholders.

These two examples do not.