Lessons learned from building data-driven product teams at Bloomberg Industry Group

Product teams are increasingly embracing a data-driven approach to craft exceptional customer experiences. We had the opportunity to engage in a conversation with Kayode Dada, the Vice President of Product Experience at Bloomberg Industry Group, to understand his perspective on how organisations can adopt a data-centric approach to building great products.

Bloomberg Industry Group empowers professionals in government, law, tax, and accounting with industry knowledge and AI-enabled technology. The company enhances its clients’ productivity and efficiency through a continuous commitment to innovation and an unmatched mix of workflow tools, trusted news, and expert analysis – all delivered through cutting-edge technology platforms.

Kayode oversees product design (UX/UXR), product analytics, product operations and product management (News, Platforms and CRM) teams. He has recently embarked on a transformative journey to make product teams more data-driven to deliver exceptional experiences for Bloomberg Industry Group’s customers. He shared insights into this process, discussing the strategies implemented, the challenges encountered, and the valuable lessons learned throughout his journey.

A clear outcome for the feature/product you’re developing helps the team to stay focused on resolving the inherent friction between the user and the desired behaviour change. Data and insights help product teams to know if they’re progressing towards that goal and what changes/experiments they should attempt next. A clear outcome also helps the team counter the dominance of stakeholders in decision-making and ensures that feature/product performance data guide product strategy.

Transitioning to a data-driven approach is like any other transformation effort; we faced resistance to change, especially because we have successful products that have performed well until now without this approach.

We have had to deal with cultural adjustments and make this new process a part of our development work. Data literacy has been another challenge, and of course, ensuring that we have the right tools for the product team to gain insights from data.

Overcoming these challenges has meant educating teams on various metrics frameworks and helping them apply those frameworks to understand product performance.

We have also fostered a culture of experimentation and have leveraged change champions. These champions want to understand the impact of their work and push their teams to carve out the time for instrumentation and reviewing insights.

Selecting the right tool for your team comes first, followed by training and education on that tool and on your chosen metrics framework. Then, define your metrics, and finally, establish how the data you need will be collected and analysed.

Things don’t always work out as planned, so you will need to have a way to collect feedback and iterate on your metrics.

We selected Amplitude as our product analytics tool of choice, and we also have Pendo for new products. Additionally, we use Qualtrics for some qualitative studies and Dovetail to analyse and generate insights from our customer interviews. We have introduced our team to the Mixpanel Framework, the North Star framework, and also the Pendo product analytics certification course as tools for improving their metrics knowledge.

It is important to start with the end goal in mind, decide how you want to measure it, and ensure the metric is actionable by the team. For example, if you are building a new feature, you should start with the desired user behaviour change, then decide how you will measure that change.

Once that has been established, you’ll need to understand the data you have and whether it can support your desired metrics.
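As a concrete illustration of starting from a desired behaviour change, a minimal sketch might define the metric as a function over raw event records. The event names and sample data here are hypothetical, purely for illustration, and are not Bloomberg Industry Group’s actual instrumentation:

```python
from datetime import date

# Hypothetical event log: each record is (user_id, event_name, day).
events = [
    ("u1", "export_report", date(2024, 3, 1)),
    ("u2", "open_dashboard", date(2024, 3, 1)),
    ("u2", "export_report", date(2024, 3, 2)),
    ("u3", "open_dashboard", date(2024, 3, 2)),
]

def adoption_rate(events, feature_event):
    """Share of active users who performed the target behaviour."""
    active = {user for user, _, _ in events}
    adopters = {user for user, name, _ in events if name == feature_event}
    return len(adopters) / len(active) if active else 0.0

rate = adoption_rate(events, "export_report")
print(f"export_report adoption: {rate:.0%}")  # prints "export_report adoption: 67%"
```

Defining the metric up front like this also exposes whether your current instrumentation captures the events the calculation needs, which is the “understand the data you have” step.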

The main motivation for the transition was our product teams' desire to have an objective and evidence-based approach to decision-making. We needed a way to know if we were achieving the desired outcome for our users before we moved on to the next feature. 

The other motivation was that we wanted to make analysis easier. We’d been relying on a tool that required product analysts to pull reports, and any follow-up questions led to a long cycle of additional insights. Removing this friction to get the best insights was important for us.

We started by selecting a vendor to help with tool selection and implementation of the selected tool. Then, we worked to define key business questions whilst also tagging our applications for data collection and analysis.

As always, education was very important on the journey – we started to bring insights from our new tool to our meetings with our teams and stakeholders. Then, we followed up with more targeted training to support each team.

Clearly defining what you are trying to achieve or understand with your decision helps in determining the kind of data you need. Collecting both quantitative and qualitative data at the same time helps the team to understand the full picture.

While quantitative data might give insights into what happened, you often need to survey or interview users to understand why the actions were taken.

Conversely, users might express unconstructive frustrations with the product when submitting qualitative feedback, making a feature come across as worse than it actually is. Having both kinds of data helps to balance the actions/experiments you decide to do as a follow-up.

Leadership commitment to data-driven decision-making is the starting point for creating that culture. In partnership with Amplitude, we did loads of training and education to develop data literacy across the organisation. We ensured that the tools we selected were easy for most people to use, and we worked to improve the data quality to foster trust.

We also regularly highlight teams making advances towards being data-driven and use the lessons learned from those teams to improve others.

To ensure our senior leaders prioritise a data-driven approach, we do regular presentations on the achievements of the teams using this approach. Given our enhanced analysis capabilities, each strategic initiative now has a metric to monitor its progress towards achieving our desired outcomes. This approach aligns data directly with our business objectives, ensuring that we consistently feature relevant metrics in all updates and presentations by senior leadership.

For example, when we developed a new capability, a model to predict renewal based on usage, we did a roadshow to get other leaders to prioritise this initiative for their product areas.

Success in our data-driven journey will be defined by our ability to achieve our business objectives and the impact data/insights have in helping us achieve them. Improved actionable insights will be another measure of success.

The frequency of experiments we are performing to optimise our products indicates the value we are deriving from our data-driven approach. Lastly, success will be visible in the culture shift that happens when data is consistently used to guide decisions, both now and in the future.
