Strategic tips for engaging with data suppliers, making buying decisions, and dealing with the inevitable challenges.
“I don’t trust the data.” It’s an oft-heard phrase in and around biopharma commercial teams, one that suggests a lack of organizational confidence in data. Yet, executives say, the expression is really less about the numbers themselves than about the process behind them.
As companies seek to integrate increasingly complex external data into their commercial engines, from real-world data to AI-driven insights, gaps in strategy, data integrity, and vendor management are eroding trust, and quite possibly performance.
Data and analytics teams have raised the bar for targeting precision. But without a solid analytics strategy — including a disciplined approach to selecting and managing data suppliers — those sophisticated inputs can quickly turn into expensive distractions that undermine commercial efforts.
The fact is, there aren’t too many new types or sources of data coming down the life sciences pike at present, and the core data categories are stable. These days, the challenge has shifted from finding new data sources to the process of using them consistently and confidently across the organization, says Todd Foster, an associate partner at Beghou.
“It's mostly the same players” in terms of data sets, Foster observes. “There might be more integration and data linking occurring, and companies are making big strides there. But how you’re using it is the key question.”
Asked what’s changing now that underscores the need for more cohesive data application, Foster cites two integration trends — the ability to integrate data with internal systems and the continued effort to build out the HCP customer profile. On the flip side, people are spending so much money on data that companies can and should question whether all the nuance is producing a better outcome.
“At a certain point, there's diminishing returns the more data that you buy,” Foster notes.
Indeed, commercial teams are spending more than ever on sophisticated healthcare data, yet incremental data spend without an effective enablement model creates complexity faster than it creates value. Data/analytics execs rated quality, integration, and trust as the No. 1 “make-or-break factor” to maximize data and analytics functions, according to Beghou’s 2026 Biopharma Commercialization Report.
This highlights the need for a clearer data strategy from day one: Does the plan ensure that data fits the commercial team’s needs, help the team manage transitions among data providers, and facilitate the building of AI models around its data?
Even the best-laid plans can go awry, so execs need to continually pressure-test their data strategy and who’s supporting it, thinking through the fundamental questions.
In Beghou’s own research, 35% of biopharma leaders said vendors’ “black-box models or unclear methodologies” make it harder to commercialize effectively. Sixty-three percent cited “overly complex, inflexible vendor solutions.”
By understanding the problem they want to solve, and the data they’re getting to solve it, leaders will be inherently more skeptical of black-box models and less likely to naïvely choose a supplier just because it claims to have a proprietary algorithm, Foster says.
Then there’s the matter of switching data vendors, a not-uncommon occurrence and one that plans should account for.
“These things are going to happen — data providers are not going to have 100% stable data sources in their underlying collection,” says Foster. “If you know that's going to happen, try to be as stable as you can. It doesn't mean that you shouldn't change data sources, for instance if there is a better one, but you have to be aware of some of the consequences.”
Those consequences should inform the strategy, which should include measures that enable switching with confidence. For instance, criteria for prospective vendors should include a requirement to provide several years’ worth of historical data, so that the client can run analytics on it and see what would change if they switched to that source.
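A parallel-run analysis of that kind can be sketched simply: compute the same key metric from the incumbent vendor’s data and the prospective vendor’s historical sample over an overlapping period, then flag months where they diverge. The column names, the market-share metric, and the 5% tolerance below are illustrative assumptions, not anything prescribed in the article.

```python
import pandas as pd

def market_share(df: pd.DataFrame, product: str) -> pd.Series:
    """Monthly share of claims for `product` out of all claims in the market."""
    totals = df.groupby("month")["claims"].sum()
    ours = df[df["product"] == product].groupby("month")["claims"].sum()
    return (ours / totals).fillna(0.0)

def compare_vendors(incumbent: pd.DataFrame, prospect: pd.DataFrame,
                    product: str, tolerance: float = 0.05) -> pd.DataFrame:
    """Flag months where the two sources disagree by more than `tolerance`.

    Both frames are assumed to have `month`, `product`, and `claims` columns
    covering the same overlap period (hypothetical schema).
    """
    a = market_share(incumbent, product).rename("incumbent")
    b = market_share(prospect, product).rename("prospect")
    out = pd.concat([a, b], axis=1).fillna(0.0)
    out["gap"] = (out["incumbent"] - out["prospect"]).abs()
    out["flagged"] = out["gap"] > tolerance
    return out
```

Running this before signing a contract turns “what would happen if we switch?” from a guess into a month-by-month divergence report that leadership can review in advance.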
As an example of why advance contingency planning could come in handy in those situations, consider a vendor transition and its aftermath. Say a company has chosen a supplier for its commercial data and built all of its reports around it. Someone then realizes, “This isn’t what we thought it was going to be,” and a change in vendor follows.
The numbers from the new data provider may send a different signal about the market. Unless that new data and its signal are properly onboarded, with team members trained to anchor on it as their new source of truth, it’s going to be a bumpy road.
“When you decide on a new data source, there's a lot of change management that has to happen and an inherent lack of trust,” Foster advises. “Depending on how different the data is from what you had purchased before, this means you've created a reason for people to not trust what you've told them.”
To prevent this, Foster recommends the “measure twice, cut once” principle. Choose the data properly, because you don't want to have to revisit that decision (either the vendor, the data, or both), unless doing so is part of your strategy.
Foster highlights other contingencies. He recalls one situation in which a change in provider or data source left the company with a new data set heavily skewed toward a payer that happened to offer preferential coverage for a specific drug. Leadership had anchored around a 62% market share; in the newer, biased data set, that number jumped to an artificially high 74%.
Once they happen, it’s exceedingly difficult to mitigate those kinds of anomalies.
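That 62%-versus-74% anecdote can be reproduced with a toy calculation. The payer-level shares and sample weights below are hypothetical, chosen so the arithmetic mirrors the numbers in Foster’s example; the point is that reweighting a payer-skewed sample by an independently known payer mix recovers the less biased figure.

```python
# Per-payer share of our product in the vendor sample, plus the fraction
# of the sample each payer accounts for (all figures illustrative).
sample = {
    "payer_a": {"share": 0.90, "sample_frac": 0.60},  # over-represented; favorable coverage
    "payer_b": {"share": 0.50, "sample_frac": 0.40},
}

# Known national payer mix, e.g. from an independent benchmark source.
national_mix = {"payer_a": 0.30, "payer_b": 0.70}

# Naive share: trust the sample's payer mix as-is (the biased 74%).
naive = sum(p["share"] * p["sample_frac"] for p in sample.values())

# Reweighted share: weight each payer's share by the true mix (the 62%).
reweighted = sum(sample[k]["share"] * w for k, w in national_mix.items())

print(f"naive share: {naive:.0%}, reweighted share: {reweighted:.0%}")
# prints: naive share: 74%, reweighted share: 62%
```

The check is cheap; the catch is that it only works if the payer composition of the new data set is profiled against an external benchmark before the numbers reach leadership.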
“It’s possible, through planning, to avoid learning through nightmares,” quips Foster. These can be million-dollar mistakes. “Are you going to go tell your board that you were wrong by millions of dollars?” he asks rhetorically.
Such scenarios speak to the importance of asking the right questions when evaluating potential vendors.
Given how frequently sources and data providers evolve, companies should reevaluate their data strategy every 18 months, Foster urges. This should be a conscious, proactive exercise rather than a reaction when issues like these arise.
Circling back to the earlier point about trust, what’s really motivating those remarks about lack of data confidence? Reading between the lines, what people are really saying, insists Foster, is that they don’t trust the process.
People don’t distrust the recorded numbers themselves. “The issue is rarely the raw data asset,” he explains. “It’s how it’s sourced, integrated, interpreted, and governed.”
The proof, he asserts, is that such issues are rarely confined to one group — either everybody trusts the data or nobody does. The data plan must be inclusive in that it should treat colleagues as business owners, not just consumers who are expected to absorb as truth whatever data points come their way.
“If you don't understand why this insight came to be or how, why would you trust it if it doesn't align with your perceived experience?” Foster adds.
So, to the extent that change is needed in organizational analytics, it’s not necessarily the data itself that must change; rather, organizations need a more human-centered, thoughtful use of data in applications such as AI. And that comes down to having an airtight, proactive strategy that’s in tune with the needs of internal stakeholders.
As biopharma companies invest heavily in external commercial data and AI-driven insights, many are discovering that vendor missteps, integration gaps, and unclear methodologies can undermine performance just as quickly as they enhance it. Indeed, data mistakes usually stem from unclear strategy, not bad data.
The above examples illustrate the pitfalls of an ill-conceived analytics plan. While leaders may not be able to avoid every one of these issues, they can forestall the need to backtrack on agreed-upon basics, thereby preventing data issues from causing a product launch to go off the rails.
The key is having rules in place for applying data consistently, managing vendor risk proactively, and ensuring the organization trusts the process behind the numbers.
As to the latter point, Foster stresses that the plan should make all team members feel like data investments were made with their business interests in mind.
“Because every internal group can benefit from data, everybody needs to feel you listened to what they want to be able to do,” he says. “So as a collective, you can make the decision that says, ‘We hear you.’”