DA Hub Club Munich - Agenda

Please note this is a live event; all times are Central European Time.
We do not record the discussions, to ensure delegates are comfortable sharing openly and honestly.

8:50 Virtual Lounge (checking your tech & meeting other delegates)

9:00 Opening Welcome

9:15 First Discussion Round

Round 1

April 24th

9:15 - 10:35 CET

Fabio Casutt

Developing a truly effective analytics programme requires a good balance between people, processes and tools (in this order!). In addition, the team must develop a high level of credibility with other parts of the organisation in order to (a) be heard and (b) be given the necessary funds and support to implement and run the programme.

In this discussion delegates will be encouraged to share their views and experiences of what makes an analytics programme great. We will look to answer the following key questions:

  • How to build the right team for your organisation?
  • How to place the team strategically and evolve it?
  • What to invest in, and how much (data QA, analysis, reporting, knowledge transfer, team education & development etc.)?

Join Fabio Casutt, Head of Analytics at Swiss electronics retail chain, Interdiscount, to explore the critical intersection of people, process, and technology and how they impact your analytics success.

  • Have a stakeholder centric setup of your team instead of a technology centric setup
  • Having good communication skills is essential for analytics teams
  • Have great mentors supporting the team: learn from the best to train your analysts to be the best
  • Ensure there is a constant flow of information between analysts and users of data
  • Set up Single Points of Contact (“SPOCs”) or Data Champions within your stakeholders’ organisations:
    • SPOCs should be very motivated to work with data and analytics
    • They should have time to work with analytics - not just another task “on top”
    • SPOCs should have influence within their organisation to change the way decisions are being taken, thus, their leadership should be listening to them
    • You need to invest in extensive training of your SPOCs
    • But beware – this may not be a sustainable path if your organisation is experiencing high personnel turnover
  • Ensure your stakeholders have access to self-explanatory tools and you provide them with short but detailed “How-To” videos, so they can accomplish their most common tasks easily
  • This should also ensure that your stakeholders keep using the tools regularly and do not need frequent retraining
  • Always tie any requests for data or analysis to a business outcome
  • QA is important but so is trust in your team’s work. Do not let QA slow you down too much

It's 2020 and gender equity in the workplace is still an important topic. In several countries Women in Analytics meetups are well attended by both men and women. How can we participate in this topic?

Let’s have an open conversation – what success stories do you have to share with the group? What techniques have you applied in your organisation to attract and retain more women analysts? How do we address the differences in equity perceptions among men and women? How do we achieve a better gender balance in senior leadership positions? Where do we need to focus our energies for the next year?

The Women in Analytics discussions at the DA Hub conference tend to be the best rated by both women and men. We welcome both genders to make valuable contributions, each from their unique perspective, to what continues to be an exciting discussion about our profession.


  • A diverse team leads to better results since it helps us understand different opinions and perspectives
  • A diverse team helps us create a safe environment for everyone
  • Diversity attracts diversity, females attract other females
  • To let everyone feel more included starting tomorrow, address all colleagues with:
    • 'Hey people' instead of 'Hey guys'
    • 'Let's have food & drinks' instead of 'Let's have beer & pizza'
  • Having a good experience with female leaders will inspire others to attract more women (also in senior leadership roles)


  • Find a mentor (either gender) who will help you along the journey. This can be someone you aspire to be like on a personal or skills level. She/he can be internal or external to your organization. A personal click is the critical aspect
  • Be a leader at heart to become a leader, irrespective of your title
  • Be an influencer in your organization
  • Focus on what you CAN do and stop focusing on what you cannot do
  • Step out of your comfort zone, e.g. take the stage at a Women in Analytics event
  • Take risks of failure to achieve success


  • Focus on adjusting your job description to attract more women in your team:
    • Try to focus on soft skills (e.g. overall analytical thinking, attention to detail, being thorough) instead of hard skills (e.g. technical knowledge or tool XYZ).
      • Hard skills can be shared and taught, soft skills are much harder to teach
      • Women might have the technical skills required, but are more critical of themselves regarding hard skills
    • Studies show women tend to want to check all the boxes: even though they might check 9 out of 10, they might feel they are not qualified, while men might say they are qualified while checking half the boxes
    • If you want to focus on hard skills, try to generalise them, e.g. experience with digital analytics tools instead of Adobe Analytics/Google Analytics
  • You can do branding for your team by presenting at (Women in) Analytics events to create awareness among all analysts, including women. Show them what cool things you are doing to encourage women to apply to work at your company

Simon Pilkowski
Boehringer Ingelheim

Matrix organisations, where reporting relationships are set up as a grid, or matrix, rather than in the traditional hierarchy, have been around for many years. This structure is particularly common in companies that run functional centres of excellence alongside business-specific professionals and processes. So how do we manage analytics effectively in such organisations?

Join Simon Pilkowski, Global Lead Digital Performance Management at pharmaceutical giant Boehringer Ingelheim, for an engaging discussion about analytics in matrix organisations. We will examine the following questions:

  • What are the most common analytics team structures in a matrix organisation? What can we learn from each?
  • How to define analytics accountability?
  • How to avoid duplicating analytics efforts?

If your organisation is matrixed or at least partly matrixed then you will find this conversation valuable, helping develop new ideas for your analytics programme.

In order to successfully manage digital analytics within a matrix organisation it is critical to have a common vision across different functions, especially given that a matrix structure can lead to:

  • More stakeholders being involved in decision making
  • A lack of accountability (given reporting lines)
  • A matrix only works with constant synchronisation across groups

Analytics leaders must:

  • Let go of strict control and empower analysts to shine (distributed accountability)
  • Build strong communities of exchange and share best practices
    • One participant shared how they are leveraging Slack for communications across multiple working groups including analytics and business stakeholders. They have seen a significant improvement in outcomes and stakeholder satisfaction
    • Another participant shared how they leveraged chatbots for both internal (stakeholders to analytics team) and external use
  • Define clear roles and responsibilities between analytics and business teams

Analysts must:

  • Understand the business context – it is key to successful stakeholder management
  • Speak your stakeholder’s language
    • Must understand what stakeholders need
    • Analysts like to do analytics (and tick all the boxes) rather than have a back-and-forth idea exchange with stakeholders that would help optimise the outcome of the analysis
    • Share business analysis use cases with other analysts rather than the normal focus on technical use cases
  • Ensure data interpretation is done correctly by upskilling the organisation
  • Service offering & accountability:
    • One approach used by several participants was a product service model – where the analytics team has defined products
    • Some participants suggested that a product service model can lead to stakeholders treating the analytics team as an external agency which would be asked to do everything and anything
    • To avoid this situation it is critical to have good leadership and a clear centralised mission statement defining the role of the analytics team
    • Another participant raised the concern that a product service model could lead to analysts becoming too reactive – too much of a service provider and not offering enough strategic thinking. Could lead to a diminished appreciation of the analytics function (data supporting evidence rather than data supporting growth)
    • The analytics team leader should have a deep understanding of the subject matter to lead effectively
    • A unicorn would be ideal but hard to find/shape. An alternative would be to have a strong assertive leader to deal with the politics and relationship management and a deputy that is incredibly good at the subject matter

Additional general takeaways (some highlighted above):

  • Communication is critical, more so in a matrix org where reporting lines vary
  • One Size Fits All does not work well in matrix structure. Thus, the analytics team might need to sacrifice consistency (and with it some efficiency) to achieve greater stakeholder engagement
  • That said, because managers in a matrix org have more roles, efficiency is critical. So a constant balancing act

Janine Lee
Ringier Axel Springer Switzerland

The success of your analytics team is dependent on efficiency. Objectives & Key Results (OKR) is a goal setting framework designed to align company, team, and personal objectives, measure their success and increase efficiency.

We invite both managers and senior analysts, whether using OKR or any other framework, to share your experiences about improving analytics team efficiency. In this discussion, led by Janine Lee from publisher Ringier Axel Springer, we will share views on how to apply the OKR framework to analytics teams in order to push their output beyond their comfort level. Questions we will cover include:

  • What are suitable objectives analytics teams should achieve?
  • How can the outcome of these objectives be translated into measurable key results?
  • How to organize and run productive OKR workshops within your analytics team?
  • How to derive the right initiatives for your analytics team to achieve your defined goals?

Expect to come away with practical ideas to increase your team output and quality of analytics work.

  • Overall benefits of OKR framework:
    • Motivation to work towards a challenging goal in a clearly defined process
    • Justification to budget analytics team resources according to priorities
    • Measurable feedback on achievements
  • Objectives should be aligned with the overall company strategy:
    • In an ideal scenario, an analytics team gets top-down OKRs from the company but with the liberty to create an independent internal rollout for the team
    • In that case, align objectives closely to the analytics team immediate stakeholders
  • Generally, external vs. internal objectives can be differentiated:
    • External – closer aligned to top line company vision e.g. improve sales, make progress in certain product or market areas, be the best news platform --> works best for decentralized teams where the analysts are closely integrated in the product teams
    • Internal – closer to the analytics function in the company, e.g. improve infrastructure, better data quality, introduce self-service culture, get people more data-driven --> works best for centralised analytics teams
  • Recommendation on how to apply OKRs:
    • Define quarterly or half-yearly cycles; work with 3 – 5 objectives that are backed up by 3 – 5 key results
    • Define concrete initiatives to achieve the objectives
    • Even if the objectives & key results have external origin, each department and team can define their own initiatives
  • You can organize the OKR roadmap in chapters, sprints, quality cycles and check-ins
  • Measure performance as a percentage completed or as a binary result, and check constantly on progress
    • Set ambitious, challenging goals that are not easy to achieve. "70% is the new 100%!"
  • Results should be OUTCOME rather than OUTPUT related
  • Possible range of key results
    • Project completion
    • Usage of tools (e.g. login rate)
    • Audience metrics (revenue, conversion rate)
    • Allocated impact of A/B testing
    • Stakeholder feedback through survey
    • Estimate on cost/time saved
    • Number of tags implemented...
  • Recommended reading – Measure What Matters by John Doerr
  • Tool recommendation:
    • Workpath
    • JIRA
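The grading approach described above (scoring each key result as a percentage completed or a binary result, with "70% is the new 100%") can be sketched as a small helper. This is an illustrative sketch only; all function names, objectives and numbers below are invented, not from the discussion:

```python
# Minimal sketch of OKR grading: each key result is scored either as a
# fraction of target achieved or as a binary pass/fail, and the objective
# grade is the average of its key results. All names are illustrative.

def grade_key_result(actual, target, binary=False):
    """Return a score in [0, 1] for one key result."""
    if binary:
        return 1.0 if actual >= target else 0.0
    return min(actual / target, 1.0)

def grade_objective(key_results):
    """Average the grades of a list of (actual, target, binary) tuples."""
    scores = [grade_key_result(a, t, b) for a, t, b in key_results]
    return sum(scores) / len(scores)

okr = [
    (35, 50, False),  # e.g. self-service tool login rate: 35% vs a 50% target
    (1, 1, True),     # e.g. data-quality project shipped: yes/no
    (8, 10, False),   # e.g. 8 of 10 planned stakeholder trainings delivered
]
grade = grade_objective(okr)
print(round(grade, 2))  # 0.83 – with ambitious targets, ~0.7 already counts as success
```

Because objectives are deliberately ambitious, the grade is read against the "70% is the new 100%" convention rather than against a literal 100% bar.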

Marco Linz

Many if not most organisations today consider themselves data-driven. Yet we still seem to convert only a fraction of our digital visitors into a transaction (purchase, subscription, download etc.). Obviously, we all know that not everyone is ready to transact with us at any given point in time. However, are we still leaving too many visitors disappointed?

In this discussion, led by Dr. Marco Linz of data-driven marketing consultancy Feld M, we will examine this topic and look to answer the following questions:

  • What makes for a truly effective data-driven company?
  • Why are cross- and up-sell rates so low?
  • Which hurdles prevent most of us from leveraging the data we have in order to optimise our offerings, gain competitive advantage and push conversion rates up?

Come share your experiences and views so together we can tackle some of these fundamental challenges better. You will leave with practical advice to improve both your analysis and optimisation programmes.

To set the scene we had a look at current info on the use of data. We agreed that conversion rates in eCommerce are generally in the 3-5% range.

Reasons for the phenomenon we discussed were lack of…

  • Data availability
  • Data understanding, education and data-driven culture
  • Data sharing because of silos and data ownership fears
  • Communication in technical terms instead of natural language
  • Questioned expertise of website managers, marketeers, writers, c-level etc.
  • Willingness to share actual performance figures

We identified these levers to overcome these obstacles:

  • Let the customer speak to make the need for optimisation transparent (e.g. through user testing, feedback questionnaires for non-converters etc.)
  • Get all stakeholders involved with the CRO program to jointly learn about the end customer, decide on optimisations and share successes and failures
  • Define jointly the appropriate conversion goals and “truth” (e.g. visit-based, user-based, purchasing-behaviour based, segment based etc.)
  • Jointly and constantly review your data and update the optimisation roadmap so it becomes a continuous, natural process. This will help educate stakeholders and change the culture
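The point above about jointly defining the "truth" (visit-based vs. user-based conversion) can be made concrete with a toy calculation; the event log and field names below are invented for illustration:

```python
# Toy illustration of why visit-based and user-based conversion rates differ.
# The event log and field names are invented for this sketch.
visits = [
    {"user": "u1", "visit": 1, "purchased": False},
    {"user": "u1", "visit": 2, "purchased": True},
    {"user": "u2", "visit": 1, "purchased": False},
    {"user": "u3", "visit": 1, "purchased": False},
    {"user": "u3", "visit": 2, "purchased": False},
]

# Visit-based: converting visits / all visits
visit_cr = sum(v["purchased"] for v in visits) / len(visits)

# User-based: users with at least one purchase / all users
users = {v["user"] for v in visits}
buyers = {v["user"] for v in visits if v["purchased"]}
user_cr = len(buyers) / len(users)

print(visit_cr)  # 0.2 – 1 of 5 visits converted
print(user_cr)   # 1 of 3 users converted
```

The same behaviour yields 20% or 33% depending on the chosen definition, which is exactly why the definition needs to be agreed jointly before optimisation starts.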

We identified these levers for improving conversion rates:

  • Optimisation of the product or service (process) itself instead of the pure conversion funnel
  • Big bang instead of baby-step testing
  • Focus on high-yield customer segments first (identify and scale)

The customer journey has been a catchphrase in digital marketing and ecommerce for many years. Optimising the path through different customer touch points is the end goal of the effort to understand these journeys. The toolset for analysing customer journeys has always been wedged between overly simplistic views on journeys with one-way funnels and overly complicated methods such as multi-layered Sankey flows where it is extremely difficult for humans to extract actionable insight.

In this session, led by Steffen Methling of OTTO, we will explore what has worked in the past and whether there are any new methods fuelled by advances in machine learning to extract more meaningful information from user behaviour across touch points not easily fitting into classic AIDA funnels.

Come share your customer journey analytics experience and together we will emerge with tangible outcomes you could apply at your organisation following this event.

  1. Define (model) YOUR customer journey:
    • Map and analyse touchpoints with your customer by department, use customer research, UX lab, questionnaires to get the customer perspective on what the journey is
    • Understand the business process and value chain and find touch points there. (what kind of data is collected at each touchpoint and how to connect them)
    • Some customer journeys are very long and include onboarding, product use lifetime and reengagement
  2. Data acquisition
    • Workshops to understand the customer journey across the whole business. Create a data landscape map - of the customer and the data collected from devices / touchpoints.
    • There is often a problem of data fragmentation between systems and multiple products. You need to tidy up the data and realise what data is relevant. Process template for data:
      • Compliance check of data items like GDPR
      • Assure correctness of data items
      • Scrutinise value of each data item for your customer journey, to be able to focus on key events (less is more)
    • Question existing models – they might not fit your business
  3. Analyse your journeys
    • One participant shared that they use an algorithm (SPADE) to surface common sequences/common journeys – it produces a long list of items. The challenge is to visualise and then segment it.
    • Use journey sequences to present a likelihood of success for any step in between
    • Look at engagement rather than final revenue - where we can make a direct contribution
    • Efficiency measures vs. long-term KPIs: start with efficiency and micro-conversions in journeys, then expand to longer time horizons
    • Stick to more ‘Efficiency based” measurements (e.g. do they move to the next step?) rather than looking at the impact of every touchpoint to the final goal (micro conversion rather than contribution to macro conversion)
    • One participant introduced the (interesting) concept of Micro-cohorts where you also measure time to move from one stage to the next (as part of your analysis)
    • Another participant had success with using categorisation of activity – experience leads to activity – look at cluster of movements
    • Consider the surroundings, not just the customer. In shops, the experience changes the customer journey. Contextualisation works well to explain differences in journeys and metrics, including seasonal effects, time and any broader marketing campaigns that ran in parallel
  4. Journeys and machine learning
    • Different journeys (short, long, etc.) from an ML perspective – do not design them upfront:
      • Use the data to readjust the model and the actual journey (with A/B testing), let it evolve. ML can only use info already in the data
      • Coronavirus – new event that ML has not seen before – ML cannot react in time
      • Use data mining and predictive analytics to react fast to changing patterns in your touchpoint data, i.e. changing patterns in content consumptions to predict purchases in times of lockdown
    • Multiple touchpoints - need to consider all
  5. Final remarks
    • Participants agreed that stitching up the data is the biggest technical challenge, but tools are getting better at enabling such stitching
    • Still too much focus on the business perspective rather than the customer perspective of the journey. This will only be solved through continuous education of business stakeholders
    • Attached below are the results of a poll conducted at the start of the discussion to gauge participants' areas of interest
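One participant mentioned using the SPADE algorithm to surface common journey sequences. As a much simpler stand-in for that idea, counting contiguous n-step subsequences across journeys can already surface common paths; the journey data below is invented for illustration:

```python
from collections import Counter

# Count contiguous n-step subsequences across customer journeys to surface
# common paths. This is a simplified stand-in for sequence-mining algorithms
# such as SPADE (which also handles non-contiguous sequences); the journey
# data is invented for illustration.
journeys = [
    ["ad", "home", "product", "cart", "purchase"],
    ["search", "product", "cart", "purchase"],
    ["ad", "home", "product", "exit"],
    ["home", "product", "cart", "exit"],
]

def common_subsequences(journeys, n=2):
    counts = Counter()
    for journey in journeys:
        for i in range(len(journey) - n + 1):
            counts[tuple(journey[i : i + n])] += 1
    return counts

# e.g. ("product", "cart") and ("home", "product") each occur in 3 journeys
top = common_subsequences(journeys, n=2).most_common(3)
print(top)
```

As noted in the discussion, producing the list is the easy part; the real challenge is visualising and segmenting it.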

Ole Bahlmann
Asana Rebel

Many, particularly non-analysts, might consider growth analytics and product analytics as one and the same. Both disciplines examine customer behaviours and share common metrics (e.g. revenue generated, purchase and churn rates, repeat visits and retention rates) but serve different purposes. The main purpose of growth analytics is to analyse the health of the business whereas product analytics is particularly concerned with measuring engagement, informing product development and improving customer product experience.

Ole Bahlmann, VP Data, IT & Customer Care at Asana Rebel has vast experience using both disciplines. In this session, we will discuss and debate how to make the most of product analytics including:

  • Use cases for improving product retention, customer experience and reducing churn
  • What are the essential metrics and KPIs for product analytics?
  • How best to manage the relationship with product team/s?
  • What should your product analytics tool stack look like? Do you need a dedicated product analytics platform?

We welcome practitioners using GA/Adobe Analytics/other for product analysis as well as those using (or looking to use) a dedicated tool. Expect to come away with a fresh perspective on product analytics.

Why do we make a distinction between product analytics and other areas?

  • Not sure if marketing and product analytics need to be separated
  • Does product analytics include the design of products?
  • Some participants made the point that Marketing analytics focuses on the top of the funnel (user acquisition) while product analytics often focuses on the latter parts of the funnel (user activation and retention)

How do we define product analytics?

  • Some participants distinguish between marketing and product and some do not. This seems to depend on the product or service sold
  • A question arose about how important it is to understand the user
  • Interpretations of "product analytics" can differ depending on whether the site or the app IS the product vs. products that are sold on a site
    • Many participants seem to view product analytics as the app or site and how that works, with less regard to the items being sold on the site
  • For example, the difference between a retailer and a publisher, where the content IS the product

Analytics team structures:

  • Where there are multiple product owners and teams, some larger organisations will use hub and spoke models where some specialities, such as optimisation, sit in the hub and analysts sit with each product unit
  • Others see analysts sit in a Centre of Excellence that serve all product groups / owner / stakeholders
  • In some orgs UX may fall outside of the product analytics team(s)

Analytics practices:

  • Some are moving digital analytics data out of GA / Big Query and into cloud solutions such as Azure where it is blended with other sources of BI data
  • This opens up the realm of product analytics as data from other sources is blended together to form a broader dataset
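Blending digital analytics exports with other BI data, as described above, usually comes down to a join on a shared customer key. A minimal plain-Python sketch (in practice this would be a warehouse join in BigQuery or Azure; all field names here are assumptions):

```python
# Minimal sketch of blending a digital-analytics export with CRM/BI data on a
# shared customer key. All field names are invented for illustration.
web_events = [
    {"customer_id": "c1", "sessions": 12, "last_product_viewed": "yoga-mat"},
    {"customer_id": "c2", "sessions": 3, "last_product_viewed": "kettlebell"},
]
crm = {
    "c1": {"segment": "subscriber", "lifetime_value": 240.0},
    "c2": {"segment": "trial", "lifetime_value": 0.0},
}

# Left-join web behaviour onto CRM attributes by customer_id
blended = [
    {**event, **crm.get(event["customer_id"], {})}
    for event in web_events
]
print(blended[0])  # web behaviour and CRM attributes in one record
```

Once behavioural and CRM/BI attributes sit in one record, product questions (e.g. does engagement differ by segment?) become simple aggregations.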

Is the skillset needed in product analytics different compared to other types of analytics such as marketing analytics?

  • There may be a question around domain knowledge e.g. ecommerce analytics vs publisher analytics
  • Product analytics does not have its own methodologies per se
  • But you do need basic expertise around data and some specific tools
  • Additionally, the required skill set for a practitioner involved with product analytics depends on how a business is tooled and organised
  • Collaboration between different disciplines is key to the creation of an effective product analytics unit
  • Thinking about softer skills, the group addressed the question of whether product analysts should be good at putting themselves in users’ shoes
  • The group agreed that some aspects of digital analytics facilitated just that. These were:
    • Qualitative analytics (i.e. user studies)
    • A/B testing
    • Segmenting
    • Surveys
    • Point of interaction polls
  • The group agreed that the analyst must possess the capacity to empathise with the customer
  • Following from that, the need for empathy means that requirement to collaborate with other disciplines is more important
  • The question of whether psychologists also have a role to play in the realm of product analytics also cropped up. This was seen as something that may become more prevalent in the future as analysts (and other disciplines) begin to weave a deeper understanding of user psychology into their skills toolkits

Are there good frameworks / standard KPIs in the realm of product analytics?

  • The HEART framework, developed by Google, is often used in UX
  • Knowing the right questions to start with is vital but hard


  • User analytics IS product analytics

10:35 Break

10:45 Second Discussion Round

Round 2

April 24th

10:45 - 12:05 CET

The world is changing and most probably it will never look the same as before. The unknown is the new variable to take into consideration while developing business strategy, and contingency frameworks are the new basis for any planning process. How can digital analytics and data-driven approaches boost your way to business success in these disruptive times?

In this discussion, Emi Olausson Fourounjieva will argue that it’s time to look into the depth of the challenge and find the enablers that data and analytics can enhance. Done correctly, this will not only help your organisation get through the difficulties of a crisis but also enable a faster restart and a stronger position in the market in the post-crisis period.

Topics we will cover include:

  • Agility in activities and priorities
  • Right data to right people at right time
  • Actionable data as enabler of creative thinking and optimisation
  • New initiatives for staying relevant
  • Focus on ROI

Focus will be on concrete game-changing solutions and best practices to find those 20% of activities which will help generate 80% of successful results, according to the Pareto Principle.

What can be done now to manage data in a crisis?

  • Monitoring for changes. Verify or dismiss historical changes. Keep CX and Brand in focus
  • What will the impact be on Swedish behaviour if Sweden gets a lockdown - and what can other countries that adopted a different approach learn from Sweden while opening up again?

In addition to the topics mentioned in the programme description - Agility in activities and priorities, Right data to right people at right time, Actionable data as enabler of creative thinking and optimisation, New initiatives for staying relevant, Focus on ROI - there were other topics highlighted:

  • Corporate strategies in the disruptive times and importance of connecting any activities to the Key Business Objectives. Potential opportunities of the new data-driven collaboration points
  • Mid-term (2020-part of 2021) and longer term strategies and recommendations
  • Marketing activities, dos and don'ts in the times of crisis

How is marketing impacted by the emergency / crisis?

  • Depends on the organisation and its marketing model: online + offline vs online only
    • Brand building is an important part which might be easily damaged if not adapted to the Voice of Customer and to the Long-term Brand Strategy
  • When the shops are closed down, marketing will push traffic online, but user behaviour will be different compared to what it might have been if the stores had remained open
  • Online businesses seem to be doing better than on + offline in the current crisis (#samplesizeofone)
  • Example: UN - detailed emergency protocol in place and applied
    • It might be a good idea to have such a protocol in place
    • The Customer 360 view is enabling more relevant analysis

Much of the discussion revolved around the current crisis. In the context of COVID-19 it is hard to tell what will happen three months from now; there are no historical situations to compare 1:1

  • More relevant Ad-Hoc analysis in order to discover the latest changes in user behaviour
  • Cannot compare what happens in West vs what is happening in Africa because economies are so different
  • Many Google paid algos have performed badly over the past few months. Reluctance to hand optimisation to an algo in the current situation
  • Problem with crises is that by nature they can be unpredictable which does not work so well with AI
  • Move away from reporting in times of crisis because historical data is inherently less relevant
  • Much harder to build predictive models due to changed nature of incoming data during a crisis
  • Now could be an opportunity to reenergise the connection between C-level / business managers / stakeholders and data teams, as businesses start to pay closer attention to the data and see its importance
  • Asking the right questions becomes more important in a crisis
  • Can modelling be used to control user behaviour during the times of crisis? Hard to do when the dataset is not relevant enough


“Learning and innovation go hand in hand. The arrogance of success is to think that what you did yesterday will be sufficient for tomorrow.” William Pollard

"Never waste a good crisis" Winston Churchill

Sandra Kampmann

To achieve your vision of what analytics can do for the organisation you need a team. Your analysts are your front line – their work reflects on you. They are talking with the business, interpreting data, crafting presentations. They need to understand the customer point of view, to develop the consulting skills to understand and serve the business, and the ability to not only interpret the data but put it together into a compelling story. They need the exposure to develop their careers.

That is a tall order for one job description – consultant, researcher, data expert, visualiser, storyteller. For Sandra Kampmann, Head of Analytics & Insight at ASOS, this is a daily challenge managing her team. In this session, she will lead a discussion on how to define development opportunities for analysts, help them hone their skills and keep them motivated.

Join us for a discussion that will offer tangible benefits to both team managers and senior analysts.

The questions set out to answer:

  • Where does the team sit in the organisation and what expectations do you have of the analysts?
  • Who is responsible for developing analysts, and are there structures in place to enable this?
  • How is knowledge shared and what aspects are important to share?
  • How do we coach our teams to operate better in the soft skill department?
  • How do you ensure analysts know what is expected from them?

The group managed to cover most aspects of the questions above. The key points are included in the summary below.

Overriding themes were the difficulty in coaching and developing across a variety of skill sets (technical vs. soft skills) especially considering personality types within analytical roles. It was agreed analysts had a thirst for developing technically, and the group discussed how to balance time and job requirements with that thirst.

Many ideas around knowledge sharing were offered:

  • Friday afternoon play days where groups could discuss certain topics
  • Teaching sessions open to all, where an expert presents a topic and use cases
  • Walk in sessions open to the business where analytics teams can present/share relevant analytics business solutions

Participants shared thoughts about the responsibility of analysts' development - is it with the manager, analysts themselves, external opportunities like courses or conferences, mentors and buddies?

Many different ideas had positive outcomes; however, all participants agreed that in most cases this relied on an informal structure within the team or business. Another very interesting point was around soft skills development and how we push analysts out of their comfort zone around relationship building and presenting. We agreed that analysis and insight gain relevance through commercial awareness, and analysts should be encouraged to speak to people in the business. The final discussion point was around how to ensure analysts know what is expected of them.

Again, there were a variety of suggestions and use cases:

  • An expectations or competency framework that highlights to the analyst the skills that are expected, especially those outside of the technical realm
  • A semi-annual, structured, coaching review
  • Lunches with HR to establish relationships, and anonymous surveys with stakeholders to understand gaps
  • Mentoring and/or buddy system

A well-balanced discussion with many tangible outcomes for participants to think about and try implementing, either as-is or adapted to their specific circumstances.

Till Buettner
Deutsche Post DHL Group

We are all working remotely at the moment. Whilst this situation is forced and temporary, the likely long-term impact will be increased remote working. That would represent a new challenge for many team managers, particularly in analytics where we rely on daily contact with various stakeholders to do our job well.

In this discussion, led by Till Büttner, Head of Digital Analytics at DHL, we will explore this new reality and look to answer the following questions:

  • What process adjustments are required working remotely (e.g. time management, team communications, analysis QA, internal support etc.)?
  • Is there a single operational model or multiple models for remote working?
  • Does analysis quality and consistency across the team change whilst working remotely?
  • How best to nurture and motivate talent in an increasingly detached work environment?

We invite both managers and senior analysts to share experiences and help all participants make the most of remote working.

Some interesting points from the discussion:

Most of the discussion revolved around three common themes: Team, Processes, Tools. In that context, we discussed motivation, fear, the importance of communication, and the sheer number of tools available.

Team:

  • A daily poll of team members’ wellbeing and the possibility of talking about it helps members to express themselves
  • It is ok to moan and complain where it helps release stress
  • Having coffee chats and/or “Open Bar” (e.g. an hour a day, could be over lunch time or immediately before/after), allows everyone to get together regularly
  • There are creative ways of saying thank you to colleagues (online greeting cards, greetbot for Slack, …), which help express appreciation
  • Communication skills are important – active listening is more critical than ever before
  • Prioritise video calls for critical calls (e.g. feedback for team members, colleagues); helps with engagement and interpreting facial cues
  • Leave room for relaxing work hours – it does not matter where/when you work so long as the work gets done
  • Flexibility and leniency with team members will pay off once the crisis is over


Processes:

  • Change within change can be a problem. Do not change processes or tools too much when starting to work remotely, as change often triggers fear
  • Stay with learned patterns, at least until working remotely feels more normal
  • Clearly named deliverables, milestones, achievements help mitigate misunderstandings
  • Structure your own day but give your team members the freedom to structure theirs as they see fit (and efficient)
  • Planning is key in these times to avoid constant distraction


Tools:

  • There are too many tools on the market
  • Commit to critical ones within the team as too many tools can be distracting
  • Most tools in each area do the same; avoid entering a “tool war” (Slack is better than MS Teams etc.); go with the most popular within the team

Further reading & podcasts suggested by participants included:

Final point from discussion leader:

  • Have confidence in your team and yourself. In times of remote work, trusting each other is the most valuable asset you have. Take care and be there for each other

Yael Farkas
Parfümerie Douglas

We are told that every company is now a data company, and yet we wrestle with a variety of problems related to getting managers to value, understand, and use data in their day-to-day work.

In this discussion we will explore the topic of data culture and discuss what self-service means in the context of our own businesses. Moderated by Yael Farkas, Team Lead Digital Intelligence at Douglas, we will look to answer the following questions:

  • How do you develop a strong data culture? Is it a top-down mandate from executives, a grass-roots campaign led by data staff, or a bit of both? What tactics actually work to build and sustain a data-driven mindset?
  • Who are the different customers of self-service analytics? What motivates them? How do their needs differ and how can you best enable self-service for each group?
  • What are the dangers of self-service? What could possibly go wrong once you empower everyone to become their own analyst? Or, to put a positive spin on it, how do we make data literacy pervasive?

If your analytics resources are overwhelmed, you are wrestling with your organization over the adoption of self-service analytics, or you are in data-driven nirvana, then this is your opportunity to share and learn how to maximize analytics ROI via self-service.

During the introduction round participants exhibited a clear interest in focusing on two main challenges:

  • How to foster a data-driven culture
  • Do's and don'ts for self-service and to what extent self-service is healthy

Fostering a data-driven culture:

  • Key Factors: Trust & Understanding the data
  • Data must be in context for stakeholders
  • Provide benchmarks to help stakeholders evaluate performance more effectively
  • Aim for a high level of data quality but do not try to be perfect (80/20 rule)
  • Be an ambassador! Showcase your analysis wherever you can bring value
  • This will build trust in the department, which in turn leads to trust in the data and the derived actions

Do's & Don'ts of Self-Service

  • Coach stakeholders to ask business questions, not ask for specific data
  • Evangelise the importance of looking at trends and correlation of data with the stakeholders’ business needs rather than them focusing on specific numbers (as they too often do!)
  • Analytics team can enable self-service:
    • Either through visualisation tools such as PowerBI, Tableau and others
    • Or with direct access to the analytics tools
    • With both tool types you can limit data access and drilldown options (reducing the risk of misinterpretation), but this will require extra work (and budget) from the analytics team
  • Be very careful and sensible with self-service options:
    • There is an obvious temptation to open up all/most data and apply true data democratisation
    • However, too big a data model for too many people would inevitably lead to paralysis
    • Instead consider splitting into access levels
    • Continuous training is essential for success
  • Aim to achieve a common understanding of the necessary data and definitions to avoid work duplication (and potential frustration)
  • Ensure stakeholders can easily access the right data at the right time
  • Look for scalable solutions in all aspects:
    • Allowing non-analysts to work with data but also being trained and assisted by the analytics team in self-service
    • Goal: self-service should enable freeing up analytics resource for higher value exploratory work instead of consuming more and more time
      • Should also lead to higher analyst satisfaction and better talent retention
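
The idea above of splitting self-service into access levels could be sketched as follows. This is a hypothetical illustration: the role names, fields and records are invented, not taken from the discussion.

```python
# Hypothetical sketch of tiered self-service data access.
# Role names, fields and example records are invented for illustration.

RAW = [
    {"campaign": "spring_sale", "visits": 1200, "revenue": 8400.0, "user_id": "u17"},
    {"campaign": "newsletter", "visits": 300, "revenue": 950.0, "user_id": "u42"},
]

# Each access level exposes only an agreed subset of fields, reducing
# the risk of misinterpretation (and of exposing sensitive data).
ACCESS_LEVELS = {
    "viewer": {"campaign", "visits"},                          # high-level trends only
    "power_user": {"campaign", "visits", "revenue"},           # adds commercial metrics
    "analyst": {"campaign", "visits", "revenue", "user_id"},   # full drilldown
}

def visible_rows(rows, level):
    """Return rows restricted to the fields this access level may see."""
    allowed = ACCESS_LEVELS[level]
    return [{k: v for k, v in row.items() if k in allowed} for row in rows]

for row in visible_rows(RAW, "viewer"):
    print(row)
```

In practice the allowed-field sets would come from governance agreements with stakeholders; the point is that each self-service tier sees only the data it has been trained to interpret.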

Arne Ruhkamp
Telefónica Germany

The demand for analytics and commercial insights is high and increasing. Most analysts face more requests than they can handle, and managing workloads and hitting deadlines is challenging. Consequently, identifying and prioritising the urgent, high-value requests is crucial if analytics teams are to be effective. However, value and urgency can be subjective and depend on a particular stakeholder's point of view. How do we make the process objective?

In this discussion, moderated by Arne Ruhkamp of Telefónica Germany, we will seek to answer how to best manage analytics requests. Participants will share experiences regarding:

  • What are the acceptance criteria for a request?
  • How do you keep an overview of all the analytics requests in your team?
  • What does the prioritisation process look like, and how can it be made transparent?
  • What constitutes constructive stakeholder feedback? What should be ignored, and how do you manage expectations?

Come share your experiences and learn how others are successfully incorporating a request management process into their analytics practice.

  • Participants included a variety of industries such as automotive, ecommerce, finance, telecom and start-ups
  • Everyone agreed that the number of requests outstrips the available capacity, but simply increasing analytical headcount/capacity won't solve the problem
    • There is a need for a strong analytics lead able to ‘mediate’ between the analytics team (naturally introverted) and business stakeholders (who are not intrinsically interested in data but rather in what it can do for them)
  • Some suggestions for overcoming the challenge:
    • Only accept requests backed by a clear business hypothesis and where the stakeholder can demonstrate that they can act based on your analysis; should help avoid curiosity requests
    • Ticketing systems such as JIRA can increase efficiency but must not become a barrier to valuable requests (e.g. stakeholders not comfortable using the tool, or the format doesn't help them express their desired request):
      • Set up a wiki to explain how to use the request system
      • Work with engaged stakeholders to tease out what they’re trying to achieve; if you generate a positive experience these stakeholders will become advocates which will help with getting your system more widely accepted in the organisation
      • As a continuation of the previous point, don’t limit the acceptance of analytics requests to a ticketing system if you wish to be seen as a trusted advisor/consultant within the business. It might make your life slightly harder (particularly at the start) but would increase your personal standing in the org and value generated. Could also help you identify how you can improve the ticketing system
    • Equally, ticketing systems can turn analysts into technocrats, focusing too much on the parameters set in the request ticket rather than exploring the question further
      • One key suggestion was to encourage analysts to dig into the data, especially if they feel there is more value to provide stakeholders than what they requested (e.g. hidden insight derived when segmenting the data or any anomalies identified etc.)
      • Analysts should also be encouraged to engage stakeholders with questions and propose modifications to requests
    • A Single Point of Contact (SpoC) in the analytics team keeps communication with stakeholders simple and prevents requests from being made multiple times to different employees
      • Participants agreed that a systematic/bureaucratic approach is critical for success (even if it creates challenges); otherwise, urgent and HiPPO tasks take over rather than the highest-value tasks, which ultimately diminishes the standing of the analytics team in the eyes of senior management
      • Frameworks such as Objectives & Key Results (OKRs) can help analytics teams stay focused on the most valuable tasks and also provide justification for rejecting/deprioritising requests (even from very senior managers)
    • Other points discussed:
      • The importance of analytics team visibility within the org
      • Branding the team was seen as important to help establish the team and then focus its offering
      • That would also help clarify what the team doesn't offer – important that stakeholders have clarity and understand this point
      • Self-service can reduce the number of requests but participants agreed this is more relevant for reporting and analysis of such data rather than for deep dives
      • There were no examples of analytics request automation
      • We discussed having a project manager to help optimise the process:
        • Some participants had previously experimented with having a PM
        • Results were mixed but leaning towards the negative
        • Most participants agreed that:
          1. It could be pretty demotivating for a PM to deal with ticketing
          2. To manage the process effectively would require the PM to gain significant subject matter knowledge in analytics
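
The prioritisation problem described above can be made more objective with a simple weighted scoring model. The following sketch is a hypothetical illustration: the criteria, weights and request names are invented assumptions, not taken from the discussion.

```python
# Hypothetical weighted scoring to rank analytics requests objectively.
# Criteria (value, urgency, effort on 1-5 scales) and weights are
# illustrative assumptions; teams would calibrate their own.

def priority_score(request, weights=None):
    """Score a request: higher business value and urgency raise the
    score; higher estimated effort lowers it."""
    w = weights or {"value": 0.5, "urgency": 0.3, "effort": 0.2}
    return (w["value"] * request["value"]
            + w["urgency"] * request["urgency"]
            - w["effort"] * request["effort"])

requests = [
    {"name": "campaign deep dive", "value": 5, "urgency": 3, "effort": 4},
    {"name": "weekly report tweak", "value": 2, "urgency": 4, "effort": 1},
    {"name": "curiosity question", "value": 1, "urgency": 1, "effort": 2},
]

# Sorting the backlog by score surfaces the highest-value work first and
# gives a transparent justification for deprioritising other requests.
for r in sorted(requests, key=priority_score, reverse=True):
    print(f"{r['name']}: {priority_score(r):.2f}")
```

Publishing the scoring criteria alongside the ranked backlog addresses the transparency question raised earlier: stakeholders can see why a request was deprioritised rather than feeling ignored.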

Isa Kes

The purpose of a data strategy is to set the foundation for an organisation to leverage its data and the opportunities it offers. But all too often, companies struggle to set up a data strategy that addresses their needs and environment. Obviously, it is not enough to simply implement good technology (which everyone else has), collect huge amounts of data, or perform advanced analysis. But what is it that makes a good data strategy?

Come share your successes alongside challenges with setting up your data strategy. In this discussion, moderated by Isabelle Kes of FELD M, a consultancy for data-driven marketing, we will collectively answer the following questions:

  • What are the components of a successful data strategy?
  • What are the key challenges when setting up a data strategy? What are drivers of success?
  • How to avoid data initiatives becoming an end in themselves?
  • How to translate your data strategy into an analytics roadmap?

Expect an exciting discussion with tangible outcomes to help you make the most of your organisational data.

The discussion started with agreement that having an elaborate tech stack, a huge data set and/or conducting lots of analyses is not enough; that alone does not make an organisation truly data driven. An organisation must have a data strategy to fully leverage the value of its data.

The first step in creating a data strategy is to define and align objectives across the organisation; otherwise, the data strategy will be ineffective.

Aligning on these objectives is the crucial element. Once everyone is aligned, the organisation should define the relevant use cases, followed by the tech stack, processes and data architecture.

  1. Organisational structure:
  • A combination of a top-down approach (senior management sponsorship) and a bottom-up approach to ensure stakeholders accept data and are motivated to use it
    • Create ownership for the topic of data-drivenness within the C-Level (“Introduce a CDO between CFO and CMO”)
    • Decentralise responsibility throughout the whole organisation (e.g. shared responsibility for being data driven)
    • Enable cross-functional working teams (i.e. all aspects integrated – analytics, UX, IT, marketing, finance etc.)
    • Introducing roles like data ambassadors or data stewards can help create awareness of the potential of data as well as to identify data hurdles
  2. Data culture & mindset:
  • Participants agreed that a lack of desire for data is the biggest challenge when setting up a data strategy. This is caused either by a lack of data understanding or by ingrained routines, with stakeholders concerned about being challenged with data (“we are creatures of habit”)
  • Participants shared some great ideas and best practices on how to create acceptance and desire for data including:
    • Introducing show cases, highlighting added value and demonstrating how data can help solve business challenges (e.g. developing and using driver trees and delivering dashboards with the right KPIs). These activities help create the need and desire for data-drivenness amongst stakeholders
    • Setting up values and mission statements for data usage to promote data relevancy
    • Introducing data show cases events
    • Supporting a mindset shift by data advocacy (e.g. the above-mentioned data ambassadors and stewards)
    • Setting up trainings and learning on the job (training on dashboards, Python, SQL, BI)
    • A realisation that data products such as analysis and dashboards are not developed for business units but together with them
      • This creates a greater commitment as well as better understanding of what is possible and what is not
  • Setting up a culture of "embracing mistakes and failing fast" instead of the fear of being measured and evaluated by data

12:05 Closing Note

12:15 End of the Event