Agenda & Discussion Summaries

Please note this is a live event; all times are Central European Time.
We do not record the discussions, so that delegates are comfortable sharing openly and honestly.

Monday, June 15th

12:50 Virtual Lounge (checking your tech & meeting other delegates)

13:00 Opening Welcome

13:10 Platinum Keynote

Data is what we use to fill the gaps left by the limits of our five senses. When data is of poor quality, it is a real handicap.

Declan Owens of AT Internet has conducted countless analytics implementation projects and has consulted on how to guarantee data quality at all levels. He has achieved lasting results from which the teams he worked with years ago still benefit today.

Over the course of this short presentation, Declan will share with you four key tips to ensure data quality. These simple takeaways will empower you to offer reliable data for decision making to your stakeholders, whilst also giving you the keys to set up trusted and sustainable data governance in your organisation.

13:25 Round Table Discussions - Round #1

Each delegate selects and participates in one discussion from below

Round #1
Mon, June 15
13:25 - 14:45 CET

Allison Ma

So you are looking to grow your data team but are unsure about the most suitable roles and team structure. You are not alone: many team leaders are confronted with challenging questions about their talent recruitment and retention strategy, particularly given the continuous evolution of roles and skills in our industry.

Allison Ma, Head of Data Science for Wagestream, is currently facing these challenges. In this discussion, she will guide us through the following questions:

  • How technical should your hires be?
  • Should you hire in anticipation of company growth or after growth?
  • What about hiring for current projects vs upcoming projects?
  • When is a junior candidate more valuable than a senior candidate?
  • Is there an optimal mix of levels and skills within an analytics team?

These questions and more will be answered in this exciting discussion about analytics team growth. We will aim for both analysts and managers to leave with insights on how individuals fit into an overall team and how to piece a great team together.

  • Data team maturity framework: Scalable, collaborative, risk-taking, acclaimed
  • Who should be the early hires?
    • First person should be “jack of all trades”, somewhat of a unicorn
  • Who do you hire next?
    • A storyteller
    • Having partnerships with universities is useful
  • Optimal level and mix of skills
    • Soft skills matter
    • Need a mix of people: someone more into management and someone who is more of a specialist and detail oriented
    • Hard to start a team with only junior members
  • Tricky management question: how do you manage if you have people in your team with very different skill sets?
    • Character of the person is very important in this case
    • Breaking down projects you have less understanding of into smaller topics is helpful
  • How do you hone analysts’ skills?
    • Project management skills are very important
    • Challenge analysts with different skills
    • Keep goals of analysts in mind
    • If you do not share their skills, you need to be able to speak their language without knowing all the details
    • Junior analysts love to be challenged and learn new things, so take this opportunity to challenge them
    • Send an analyst or scientist to business side to understand goals and processes
    • It is important to have a great senior analyst to mentor others
    • You should share WHY someone wants to learn a new skill, not just make them learn it
    • You should not force people to do something they do not want to
  • Hire internally vs externally?
    • Hiring internally sometimes makes sense because internal candidates come with existing knowledge and a different perspective
  • How do you manage analysts’ time while they are developing new skills?
    • 80/20 rule: 80% client work and 20% personal development work
    • The team trains each other
    • Figure out what team members liked learning and did not like learning
    • Team members should have confidence and feel trusted, not controlled
    • Make sure personal development projects help the business as well
  • Does it make sense to take more time and then develop a specific skill?
    • It works and allows junior people to improve and learn new skills
    • Can give one day per month during which people can work on new topics
  • Development of more senior people
    • A lot of people do not want to be leaders, but rather be experts: allow them to do that with a new role that is expert not leader
    • Good to block a whole day as senior people are often swarmed with client work
    • Seniors have an understanding that they need to self-develop
  • How is the data team perceived as successful?
    • All senior people need to understand the team’s goals and that you need to push the boundary to achieve the goals
    • Regular check-ins with product/other teams regarding data team
    • Constantly building trust with other teams
  • Expanding the team
    • If you want to expand the team, you must show the outcomes and where you and your team contributed
    • Impact analysis on every project or decision backed up by data

Ian Thomas

GDPR has transformed the world of digital data and analytics, forcing organisations to fundamentally rethink their strategies around the collection, retention and use of customer data, as well as their use of third-party data.

This discussion, led by Ian Thomas, will focus on strategies and tactics that organisations can deploy to fill the gap left by GDPR rules and related restrictions (e.g. Apple’s Intelligent Tracking Prevention and Facebook’s restriction on third-party data brokers), such as gathering more first-party customer data, moving towards contextual targeting and other approaches.

Come share and learn how to manage and best utilise your customer data in an era of intensifying privacy regulation.

Confidence in GDPR compliance, two years on:

  • About half of the participants felt confident about the way they managed personal data in a GDPR context
  • Less confidence about GDPR compliance now than in 2018, due to confusing rulings from various EU data protection agencies, and rapid changes in posture from key technology players (especially Google and Apple)
  • Many activities are still being guided by risk-averse lawyers - much nervousness about even setting cookies in some situations
  • A good approach is to think about minimising (not eliminating) privacy risk - asking yourself "Do I have risk under control?"
  • Cookies still dominate the participants' thinking about privacy - other topics (secure storage, data subject requests, international transfers) are much less prominent

How has GDPR changed our use of personal data?

  • Several participants asking themselves, "Should I even bother with personal data? Is aggregate/site-level data enough for my purposes?"
  • There is a conflict between wanting to delete historical data for privacy reasons and being required to keep it for audit purposes
  • General unease about continuing to rely on third-parties for data collection (especially Google) - participants considering first-party data collection methods
  • Attribution of marketing spend will become increasingly challenging with reduced personal data, perhaps prompting a return to more brand (or even offline) spending

Ben Stephens

It is not new to discuss data as a product in relation to big platforms and expensive IT programmes, but what about data products as everyday features and capabilities embedded within the teams that rely on them?

At Burberry, Ben Stephens’ team is building data products to be used across the business. In this discussion Ben will explore what we mean by data products, and how they can be developed and integrated into teams and processes so that they become part of the business’ daily life. Using examples such as visually integrating data, insights and recommendations into a non-technical team’s workflow, we will look at the benefits of showing how subject matter experts can be empowered with the right data at their fingertips. We will explore how to identify candidates, when the right time is to introduce such capabilities, and the potential pitfalls of becoming ‘IT Support’ for the data solutions we build.

We will finish by discussing how this might impact data teams, the kind of roles that become necessary – designers, engineers, journalists and potentially more embedded alongside analysts.

Join this thought-provoking discussion if you are looking to empower your business teams or you are looking for new ways to explore your data.  

What is a Data Product?

  • Discussed whether a dashboard would be considered
    • Good comment that data products must enable action; a basic dashboard, no, but something more targeted, maybe yes
  • Can be internal (business) or external (customer) facing but primarily used by ‘another’ as self-service
  • Supports either better decision making or more automation in decision making process (bypass people)
    • Should reduce time to action
    • Should de-skill tasks so end users can do more
  • Data products are custom built, not off-the-shelf tools bought into a business
    • Allows for evolution
    • Iteration is key for a product lifecycle
    • Start with MVP (had the term Minimum Viable Analytics) then build from there

Identifying Opportunities to create Data Products

  • Often uncertain of benefits of starting with something basic but aesthetic/quick/functional vs something more complex though common/recognisable to the business (churn, attribution)
  • Key callout, get stuck in and make a start rather than extended planning
  • Important to discuss them in terms of Products, which is a known concept to the business and inherently involves ongoing iterative process, as opposed to Projects which tend to have short term (maybe expensive) connotations
    • --> Productionisation of insight through productisation of data
  • Call out that greater success comes from solving problems/pain points vs introducing opportunities, as solutions feel more tangible when improving something that is a struggle

Enablers for Data Products

  • Rapport and relationships are important to establish foundations of building something bigger (especially for Consultants)
  • Big consideration is the ongoing maintenance and ownership once created and established. Additionally, important to maintain momentum in usage/marketing/PR for the product
  • Question raised about building things with a sunset date in mind to 1) ensure iteration, as v2 (etc.) can be launched and 2) give a clear shut-off date if usage is dropping and there is no longer a desire to support it. Can be easier than formally retiring a product or letting it limp on for a long time
  • Good point made on explainability being important for persisting products, as understanding and the ability to maintain/develop models or products is obviously important to their longevity and success
  • Callout that Products need marketing in the business or to customers (depending on audience) so can be heavily dependent on factors other than quality of the product.
  • Key enabler for the product is sustainability – teach a man to fish etc etc.

How to support Data Products

  • No clear or obvious approach to support models – it depends
  • If customer facing, then IT would need to support anything developed through data teams. Internal would likely be different, with data teams taking more ownership
    • Concept of a data lab raised which nurtures early products/ideas then when developed passes off to IT
  • Support depends on ownership and buy in from the owner to keep pushing and marketing. Therefore, success can be a question of clear responsibility and accountability
  • Key callout – consideration for upstream changes and imported/updated data models which may impact maintenance work
  • Community support was raised as a good model where there was a pooled sense of responsibility and mutual support for data products. Support depends on both technical/data expertise as well as end-user domain knowledge
AT Internet

Data quality is something that all analytics experts support, as they know that it is essential to guarantee reliable decision making. But though we all support it, how many of us can say with full certainty that we have reached optimum data quality? Some of us sometimes wonder, is that even possible?

Declan Owens, Digital Analytics Expert at AT Internet, has tackled this challenge alongside many teams as a consultant. Over this discussion, he will invite us to share our own experiences, feeding a debate around the following questions:

  • What actually is “Bad Data”?
  • Is Data Quality more about precision or transparency?
  • What do you do when you realise all the data you’ve been using historically is of bad quality?
  • How do you maintain quality against requests for quantity of data?
  • Once the data is clean, how do you ensure it stays clean?

So, whether you are already one of the brave analysts who dared to pick up that shovel and dig deep into their data, or whether you’d like to learn how one digs that hole and survives, you should definitely join this discussion.

Key highlights:

  • Data sobriety must be applied above all
    • Marketers and other digital trades are too used to having tons of data at hand
    • The excess makes it very difficult to control the quality of data
  • Data quality issues are often due to organisational issues rather than lack of tooling
    • It is important to have a data governance strategy and find the strategies to drive a data culture within the organisation
  • The main key to success is building a data culture within one's organisation, where teams understand each other and trust their colleagues to collaborate efficiently with the best result

The final word to conclude this talk is "Trust". It is essential for your collaborators to trust you as an analytics professional, for you to trust their knowledge of the business, and for everyone to trust the data.


Michael Böhme

What gets measured gets managed; thus, KPIs serve as the cornerstone for any strong marketing analytics programme. But are you sure you are using the measures that matter most? The wrong marketing KPIs could drive your marketing effort in the completely wrong direction.

In this huddle, Michael Böhme of Zalando, will argue that gaining alignment for your marketing KPIs is the key challenge; once gained, the subsequent steps are amazingly easy. Do you agree?

We will also tackle the following questions:

  • What are your top marketing KPIs and why?
  • How does your sales funnel impact your marketing KPIs (e.g. ecommerce vs. lead generation vs. brand)?
  • How do you lead the KPI alignment process and manage stakeholders’ (sometimes) conflicting requirements?
  • What are you doing to ensure your marketing KPIs are used to optimise campaigns?

Naturally, KPIs divide opinions, particularly in marketing. Let us have a stimulating conversation, even a debate, and share views on what works and does not work in establishing marketing KPIs that lead to business action and long-term growth.

Best Practices for Implementation of Marketing KPIs

  • Marketing KPIs are never done
    • A KPI development programme is essential to align strategic company level KPIs with tactical lower level KPIs for teams
    • Crucial when company objectives are changing (i.e. growth vs. business continuity during crisis)
  • A successful Marketing KPI framework should:
    • Establish a coherent picture about the customer
    • Provide a bedrock to all dashboards and analysis
    • Sometimes there is little accountability across teams, when “everyone owns the customer”
      • Especially true for customer acquisition
      • May lead to diffusion of responsibility and poor results
  • High level Marketing KPIs are not as effective for marketing optimisation.
    • Actionability comes through breakdowns
      • By stages in the customer lifecycle
    • Other meaningful metrics circle around RFM models and acquisition sources
      • Focus on the customer segments that drive the most growth and value
  • Testing of paid marketing campaigns is crucial
    • Often testing frameworks are missing or badly executed
  • Ensure to include ‘negative’ short-term metrics such as return rates
    • Work towards understanding how these affect long-term metrics like LTV


  • Poor data quality and lack of reporting automation are big obstacles for successful Marketing KPI implementations
    • Start with manually reporting a new KPI, build a story around it and once the KPI is understood and used then invest in the reporting automation
    • Loss of trust in data quality early on is hard to regain
      • Test data quality before using it
      • If necessary, modify the KPI to rely only on quality-assured data (even if it means the KPI is not ‘perfect’)
  • Tech stacks and data regulation are key factors to consider
    • Active tracking consent (part of GDPR) may lead up to 40% data loss (on average)
    • Makes it very hard to correlate marketing and onsite data
    • Do not start with what’s technically possible, rather start with the human behaviour and build KPIs around it; only then consider the technical side
  • Marketers must understand the technology behind both digital marketing and data collection, down to a granular level (particularly true for digital pure players)
  • Decision makers need to understand how a KPI is calculated
    • But also need to comprehend how data can change over time when metrics are re-calculated
    • Distinguish metrics from KPIs and ensure management understands and is accepting of the KPIs
    • Less is more!  Keep the number of KPIs small
      • Other metrics are operational measurements and proxies for performance
        • Might be defined as secondary or tertiary KPIs
        • Also known in some orgs as health metrics
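The “test data quality before using it” advice above can be made concrete with a few automated checks run before a KPI is reported. A minimal sketch in Python; the field names and rules here are hypothetical, not something from the discussion:

```python
# Minimal pre-KPI data quality checks -- an illustrative sketch only.
# The 'date', 'orders' and 'revenue' fields are hypothetical examples.

def check_quality(rows):
    """Return a list of human-readable issues found in `rows`,
    where each row is a dict of KPI input fields."""
    issues = []
    for i, row in enumerate(rows):
        if row.get("date") is None:
            issues.append(f"row {i}: missing date")
        if row.get("orders", 0) < 0:
            issues.append(f"row {i}: negative order count")
        if row.get("revenue") is None:
            issues.append(f"row {i}: missing revenue")
    return issues

rows = [
    {"date": "2020-06-01", "orders": 120, "revenue": 5400.0},
    {"date": None, "orders": -3, "revenue": 200.0},
]
print(check_quality(rows))
```

If the checks fail, the KPI is withheld or computed only from the rows that pass, which matches the point above about relying only on quality-assured data.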


  • Good KPIs (Marketing KPIs no exception) separate real signals from noise
  • Analytical experience is helpful for understanding KPI development
    • But humans are prone to cognitive bias (looking at the same reasons) and are slow to analyse large amounts of data for patterns
    • When troubleshooting KPI issues, leverage your peers for further insights and perspectives
  • Machine Learning, especially anomaly detection, will be of great help here if your data is valid end-to-end. Otherwise you will open up another front of problems
  • As more and more marketing decisions are automated by ML, a new set of KPIs may be necessary to monitor automated decisions and the input data quality
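As a small illustration of the anomaly detection idea above, a trailing-window z-score check over a daily KPI series can flag suspect values before they reach a dashboard. This is a minimal sketch, not a method from the session; real setups would typically use a library-backed or ML-based detector:

```python
# Trailing-window z-score anomaly detection for a KPI time series.
# Illustrative sketch only; window and threshold are arbitrary choices.
import statistics

def detect_anomalies(series, window=7, threshold=3.0):
    """Return indices whose value deviates from the trailing window
    mean by more than `threshold` standard deviations."""
    anomalies = []
    for i in range(window, len(series)):
        trailing = series[i - window:i]
        mean = statistics.mean(trailing)
        stdev = statistics.stdev(trailing)
        if stdev > 0 and abs(series[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies

daily_signups = [100, 103, 98, 101, 99, 102, 100, 97, 250, 101]
print(detect_anomalies(daily_signups))  # flags the 250 spike at index 8
```

As the discussion noted, this only helps if the input data is valid end-to-end; otherwise the detector simply flags data collection problems.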

14:45 Virtual Coffee Break

15:00 Gold Keynote

Automated data integration improves operational efficiency: you significantly reduce the time spent preparing data and spend more time generating analysis and insight to drive business optimisation.

In this short session we will cover:

  • How the analytics landscape is changing
  • The impact of automation

Come away with a clear view on how automation can help you transform your analytics practice.

15:10 Round Table Discussions - Round #2

Each delegate selects and participates in one discussion from below

Round #2
Mon, June 15
15:10 - 16:30 CET

Sid Shah
Conde Nast

Data & Insight teams are critical at any large organization. Reports, dashboards and insights produced by these teams help the business better understand customer behaviors and support decision making. Nevertheless, even with the best tools, great analysts and smart KPIs, we often struggle to drive engagement. Rather than blame the business, let us discuss and share methods to improve engagement with the non-analyst part of our organization. We will discuss:

  • Effective data storytelling use cases
  • What are the different approaches and techniques that drive stakeholder engagement? How much do subtle improvements in data visualization matter compared to the insights narrative?
  • What is the right balance between data preparation and storytelling? How do we measure success?
  • Do analytics teams have the required data storytelling skills? Where are the gaps and how can they be overcome?

Come share your experiences and views so together we can tackle one of the fundamental challenges we all face in analytics.

Key highlights:

  • The key to effective data storytelling is to understand the customer/audience needs, purpose and questions they are trying to answer
  • Simple charts, design, tools and adding text are good tactical changes. However, the output should be relevant to your stakeholder and convey the message
  • Domain expertise and technical skills in the tools & technologies of choice are must-have skills for providing context. The other important area of focus is people and influencing skills: the ability to communicate the message effectively
    • The group discussed and agreed that both data analysis and people skills are required for growth
  • Collaborating with stakeholders to understand the business impact of the work is also an effective way to measure the success of data storytelling

Andrea Mestriner

A recurrent theme across the analytics community is finding the right mix of people, process and technology. There is no one structure that fits all, yet most organisations struggle to strike a good balance. The challenge is augmented for multi-nationals: how do you balance regional requests for reporting and analysis with a globally managed platform?

As the Global Head of Analytics for luxury fashion etailer YOOX NET A PORTER, Andrea Mestriner is constantly looking to get that balance right. In this discussion he will explore the following questions with you:

  • How do you identify the problems worth solving?
  • How do you identify the right people, processes and technology mix?
  • How do you engage stakeholders in the analysis process?
  • How do you collaboratively create/share best practices across disparate teams in multiple locations?
  • How much freedom should regional teams have?

You will come away from this discussion with a set of insights and ideas for moving your analytics programme forward.

While the various team structures are a common sight, it is very hard to create rules of thumb to help select and define the right one, for example:

  • Decentralised and embedded: does it really mean the reporting line can only be to the function leader?
  • Is centralised the only way to build strong foundations?
  • Decentralised or hub-and-spoke once you have reached a certain level of size and maturity?
  • Can some functions, such as data science and AI, probably only exist as centralised?

Source: Forrester Research’s Digital Maturity Model 5.0

Team structure evolves with organisation’s digital and analytics maturity:

Source: Forrester Research’s Digital Maturity Model 5.0

Some drivers to select the correct structure are:

  • Size of the organisation and of the teams
  • Roles, skills and capability available and mapped
  • Team location (multi or few/single)
  • Rest of organisation setup
    • Global or Regional
  • Plan and way to access and use insights
  • Cost structure
    • Permanent vs. contractors vs. outsourcing
  • A real data-driven organisation requires a central structure that provides capabilities, standardisation and support, while decentralised teams manage the day-to-day
  • It is simpler to move in one direction (centralised to X) than the opposite
  • Consider the challenges that every structure creates or resolves such as:
    • Knowledge sharing
    • Failure point problem
    • Data product development
    • Career framework and progression
    • Innovation
    • Data quality and governance
    • Tools proliferation hence costs
    • Data culture
    • Silos mentality
    • Team management style (micro vs. macro)
  • Management buy-in is key for success
    • Need to get senior managers onboard and actively supporting otherwise your initiative could get derailed

Kevin Holler

Data-enabled businesses place as much importance on having a reliable analytics stack as they do on having the data itself. They continuously refine their infrastructure to support their analytics programme and advance their competitive edge.

While the components of an analytics stack are becoming more specialised and modular they are also getting simpler to set up, easier to manage, and cheaper to scale. However, doing so successfully still requires knowledge and effective process management.

In this huddle, led by Kevin Holler of Fivetran, we will address key questions such as:

  • What are the jobs of a modern analytics stack?
  • How to choose the right set of tools for your stack?
  • How can automation help you scale your stack effectively?

Come share your analytics stack use cases and hear from other leading businesses how they are scaling their analytics stacks and the benefits they are experiencing.

Key highlights

  • Kevin started the discussion with a start-up use case: they were collecting a lot of data in Postgres and using Looker for analysis and visualisation
    • Postgres crashed when they tracked over 10m events per day
    • Had to find a solution to scaling the infrastructure
  • One delegate shared experiences of pushing GA data into BigQuery through a data warehouse so they can merge digital analytics data with other data points
  • Another delegate shared how they are continuously developing integrations in order to satisfy stakeholders' requirements
    • Once the data model was established, they were able to service other stakeholders too
  • A large retailer delegate explained how their org is outgrowing their data warehouse, so they need to move the data to the cloud
    • This creates issues for analysts, as they need to continuously monitor the data models for each tool used
  • Difficult to get data sorted into a database
    • Fivetran have automated connectors
      • Simply have enough engineers to ensure the connectors are constantly updated and if they break then dedicated engineers can fix the problem and restart
      • Historically companies did not like to lose control of their data but now they are shifting to cloud solutions
        • Security improved as well as companies understanding of cloud risks
      • Clients are also becoming more comfortable with vendors managing their pipelines
  • Pipeline management is very time consuming
    • A good pipeline will ensure the data is flowing but must still have the people to ensure the data is correct
    • For that large retailer, data pipelines are so well structured now that they are no longer a challenge
      • They are piping a ton of relevant data to a warehouse but still not making full use of the data
      • The bottleneck has shifted from the data collection challenges to the data modelling and analysis challenges
        • This was a key point made during the discussion which resonated with all delegates
      • Not enough analysts and engineers to deal with the available data; not enough skill in the teams to harvest the data and create valuable insights
      • Data engineering must build customer user objects (extremely time consuming) based on all the data from the data model
    • Many tools offer connections, but the devil is in the detail of the connector to ensure the data is accurate
    • The purpose of the data is a key factor
      • ‘Activation’ (i.e. targeting, personalisation, customer service)
      • Analytics (i.e. reporting and insight)
      • If the former, then it is more likely to get prioritised as it is easier to justify financially
  • Analytics is a harder sell; it is much easier to justify spend on marketing, though not necessarily more valuable
    • It is a common failure of our industry that we cannot prove this to management
    • One participant shared that whilst working for a retailer years ago they used TV to attract people to the website. But no one was looking at the data of how TV has driven value for the brand. He started looking at the underlying data from a curiosity PoV and found great value; now this is very commonplace
      • So perhaps we need to wait a few more years until the value of data will be fully realised / appreciated
  • General agreement that the data pipelining and data storage challenges are solved / near being solved but the analytics at the top is still an issue
    • So the industry should focus its efforts in solving for those downstream challenges as it has in developing better tools to measure and track
Hello Fresh

A data strategy is a must for any company these days. It helps ensure that data is managed and used like an asset. But data sources are constantly changing, creating a challenge to maintaining high standards of data quality, transparency and accessibility. At the same time, that change is potentially the number one advantage for business growth in a competitive environment. So what can the applied dimensions of a data strategy look like, and how do we get the most out of them?

In this discussion, Tim-Fabien Pohlmann, Director of Marketing BI at HelloFresh will explore how to devise a great data strategy, implement it and ensure the organisation adheres to it. The aim is to have relevant data, which is fully governed, available, accurate and accessible in a timely manner when we need to query it.

Questions we will answer include:

  • Why do organisations struggle to maintain useful and accurate data?
  • What are the successful use cases and what are the key elements supporting them?
  • How do we ensure measurement is considered up front in every new initiative and where should accountability sit?
  • How to maintain consistency across products and business units?
  • How do we ensure data governance is prioritised, when competing with other initiatives which might appear to have more immediate customer benefit?
  • How to implement monitoring, alerting and anomaly detection to prevent data loss/corruption?

If you strive to build trust in an elaborate data environment – this session is for you.

  • Pyramid view: a Data Strategy is composed of Data Governance and Data Management
    • Put a price tag on the assets that need to be governed
      • Build up the data sets - align with the data management
      • Gather the foundations
      • What, how and why you want to achieve it
    • Business Value Layer needs to be seen - the levels here:
      • Vision of the insight function
      • People part: in-house or external
        • How to get buy-in to the governance?
          • If you show the value of data
            • Show potential value the company can do in the future - profit potential
      • Ways to make it work
        • Who should drive the DG initiative? Tech or Business?
          • Idea: approach it from both sides
          • It must come from high up in the organisation
          • SLOs and SLAs are important ingredients that also clarify ownership and responsibilities
          • A supervisory board / team is important to keep all of this in place
          • Also needed: a management system / tool in which all new joiners are registered and read through the data governance documents
        • What is important to do?
          • Documentation is important
            • For an assessment of the groundwork - do we understand the foundation of our data?
            • Set up documents and contracts
            • Find internal sponsors and top management for this task
            • Privacy is important as well
Asana Rebel

Marketing clouds are commonplace these days. The concept of a SaaS integrated solution is incredibly appealing to businesses. Cloud solutions offer greater flexibility and accessibility alongside better processing power: data can be actioned faster and more efficiently. However, cloud solutions throw up new challenges for analytics professionals.

In this discussion, led by Ole Bahlmann of Asana Rebel, we will look at all the aspects of building an analytics system in the cloud – from selecting a cloud vendor (e.g. AWS, Azure or Google Cloud Platform) to discussion around appropriate configurations (e.g. memory vs. compute) to tools to maximize governance, data intake, ETL, and analytics.

Come prepared for a discussion that will cut through the marketing hype and touch upon the real challenges of a single vendor stack in the context of delivering superior analytics insights and driving ROI.

Key highlights

  • Which tools make your analytics stack:
    • The big three – Google, Amazon, Microsoft

    • IBM also offers cloud tools

    • Databricks

    • Other SaaS tools are also part of this discussion
  • Agencies/freelancers are tied to the client’s stack unless the client is bringing them in to develop a stack
    • If client has a stack preference then that is likely to influence the selection of the agency/freelancer
    • Otherwise, might look for someone agnostic with experience across the board
  • Generally, no huge differences in cloud platforms. Key selection factors:
    • Cost
    • Performance
    • Required maintenance
    • Available skills
    • Some orgs are reluctant to pick a platform owned by a perceived competitor, concerned their data will be used to compete with them. For example:
      • Ecommerce orgs might prefer Microsoft Azure over Amazon
      • Media orgs might prefer Amazon over Google etc.
    • This can change over time
      • Client tech maturity --> perfect today ≠ perfect tomorrow
  • Who owns the decision and later the tool?
    • Used to be IT but no longer always the case
    • Analytics / data team. Sometimes that results in (part of) IT reporting into Marketing
    • Tools can drive change. IT rarely drives change
  • Who is responsible for data governance?
    • GDPR/privacy is not in the scope of cloud tools
    • Needs to be solved on a higher level
    • All platforms can comply. But remember: no PII in the cloud!
  • Data security (access, not loss) is an issue in the cloud
    • Used to be an IT department challenge (see above)
    • Now a broader challenge --> shared responsibility?
  • Wishlist for the future:
    • Infrastructure as a Service (IaaS)
    • Cross platform standards
      • Currently solved by 3rd-party services such as Anthos and HashiCorp
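The “no PII in the cloud” rule noted above is often implemented by pseudonymising identifiers before data leaves your own systems. A minimal sketch, assuming a keyed hash is acceptable for your use case (the secret key and field names here are invented for illustration):

```python
import hashlib
import hmac

SECRET_KEY = b"rotate-me-regularly"  # illustrative; keep in a secret manager

def pseudonymise(record, pii_fields=("email", "phone")):
    """Replace PII fields with a keyed hash so records remain joinable
    across datasets without exposing the raw identifier."""
    out = dict(record)
    for field in pii_fields:
        if field in out and out[field] is not None:
            digest = hmac.new(SECRET_KEY, str(out[field]).encode(), hashlib.sha256)
            out[field] = digest.hexdigest()
    return out

event = {"email": "jane@example.com", "page": "/checkout"}
safe = pseudonymise(event)
# safe["email"] is now a stable token; safe["page"] is untouched
```

Because the same input always yields the same token, analytics joins still work, while the raw identifier never reaches the cloud platform.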
Bourne Leisure

…sang John Lennon, parodying Bob Dylan. And in many ways, he was right. In our increasingly data-enabled world, without significant headcount it is near-impossible for digital analysts to keep pace with stakeholder insight demands.

Often this challenge leads to analyst frustration, a feeling that they are a service centre for ticketed requests. It is equally frustrating for stakeholders, eventually leading to a loss of belief in analytics’ ability to deliver valuable insight.

Self-service is a common solution which reduces the analytics team’s workload whilst increasing stakeholders’ access to insight and speed to that insight. If implemented effectively, it will also stimulate stakeholders to ask more meaningful questions of the data.

So how do we enable and advocate an efficient self-service model?
In this discussion we will:

  • Define the possible self-service models – is it a one-size-fits-all definition? Or do we differ by stakeholder group?
  • Review analytics team structures in a self-service world
  • Talk about the necessary tools and tactics to get it right

And, along the way, not forgetting to consider what can go wrong!
If you are considering self-service, have implemented a model successfully or failed to make it live up to expectations then this discussion is for you. Start improving your analytics ROI immediately following the conference.

What is self-service?

  • Many different ways to look at it
  • Less about service, more about self-sufficiency
    •  Give the right users the right data at the right time --> do not give unfettered access --> enable stakeholders to do their job with the data they have access to
  • Stakeholders should be able to talk about the same data without using different metrics for the same measurement (e.g. different metrics for revenue)
  • Marketing BI Team/Engineers have more time for long-term projects and to deliver more value to the company --> do not just want to provide data
  • There is often a large knowledge gap between data and business experts --> create a community to bring those two groups together --> have one common dictionary everyone is using

Company size plays a key role

  • Data literacy is really important
  • Data literacy: what is required to get these people to do their job? They will not be able to explain which data they need, but you would know which data to use to answer their questions --> very often stakeholders are not asked this question and end up with too much information (rather than insight)
  • The larger the organization, the more you need to pre-define/create static reporting --> you cannot train everyone

Differentiation of proposition by stakeholder groups

  • Depends on what the decision-making process looks like: do people make their decisions daily, weekly, monthly?
  • Who is the decision-maker? Which decisions do they make?
  • Rather a top-down report
  • Identify people within different teams with good data literacy
  • One participant shared that in their company there were varying levels of data literacy; they built e-learning training sessions; prior to that the data team had to deal with stakeholders asking many questions --> too time consuming
  • Another participant shared that their analytics team used to send out dashboards that stakeholders didn’t use; now, stakeholders have access to self-service dashboards and can have a look at it when they need the data --> give people the right data at the right time
  • One suggestion was to work with stakeholders to understand their needs prior to creating reports for them. What might look significant and valuable to you might not be to your stakeholders
  • Certain stakeholders will demand certain things
  • Some stakeholders want metrics on their report even if those just pollute the dashboard
  • One participant shared that his team provided stakeholders extensive access to the data. This resulted in over 1,000 dashboards, many if not most redundant; you must have a clear view on which dashboards stakeholders are using to serve them better
  • There is sometimes too much data and people cannot deal with it --> you need to find the data that is important to the people when they need it
  • Important to acknowledge that self-service is not for every company
  • Sometimes, people just try to get data to back their decisions --> people need to know it is the right point to look at data
    • Requires cultural accountability and some relatively smart people


  • Train stakeholders to become analytics power users
    • Was part of their annual objectives; they had to fulfil their role as power users
    • Any stakeholder taking business decisions based on the data should be responsible for the consequences if there are any errors

 How might tools change the game?

  • One participant shared how their company moved to snowflake and looker
    • More pre-configured dashboards for less advanced users
    • Not only web data, but also offline (bookings, cancellations…)
    • They bring all data into one place
  • Stakeholders need to familiarize themselves with all other data sources --> digital stakeholders should familiarize themselves with offline data (at least know which kind of data is available) --> see how other people use data

How have people dealt with communication of this change?

  • Put giant dashboards on a screen (leader board for salespeople)
    • Best idea they had to make people look at and understand data
  • Include some gamification
  • Give stakeholders a sense of the pulse of the business to help get buy-in

16:35 Sponsor Demo Rooms

16:50 Summary & Prizes

17:00 Happy Hour

17:30 End of Day

Tuesday, June 16th

9:50 Virtual Lounge (checking your tech & meeting other delegates)

10:00 Opening Welcome

10:10 Platinum Keynote

Organisations need end-to-end solutions that combine data processing with machine learning capabilities. These streamlined solutions will simplify workflows, improve efficiency and ultimately accelerate business value.

Therefore organisations are moving data and its management to the cloud because the cloud’s benefits apply directly to data management, namely elasticity for high-performance analytics processing and big data scale, but with minimal administration and entry costs for cloud platforms and tools. These benefits contribute to the success of business-driven programs in analytics, data science, reporting, warehousing, and time-sensitive operations.

However, achieving these business and technology goals depends on creating a practical, sustainable, and optimized architecture for cloud data and its applications across BI, data science, and machine learning. There are many layers in a modern cloud data architecture, but two layers stand out because they determine success or failure: the cloud data platform and cloud data integration. In this session we'll discuss how these two layers work together to create a successful modern cloud data architecture.

10:25 Round Table Discussions - Round #3

Each delegate selects and participates in one discussion from below

Round #3
Tue, June 16
10:25 - 11:45 CET

Digital intelligence solutions continue to evolve. Vendors such as Google and Adobe may dominate the market, but other players such as AT Internet, Snowplow and others are carving unique niches solving tangible client challenges.

Join James McCormick, Principal Analyst at Forrester, for an exciting discussion where we will look to answer these questions:

  • What are the key components of a digital intelligence platform?
  • What differentiates the leading providers?
  • Where is the market heading and how should you prepare yourself for that change?

We will draw upon the group’s collective experience to map out a clear view of the digital intelligence landscape and successful use cases.

What are the key components of a digital intelligence platform?

  • Some defined digital intelligence platforms as those with tracking capabilities combined with the analytics capabilities
  • Other members added the 'feedback loop', i.e. the system learning from the past and changing things
  • Classes of digital intelligence technologies: Engagement Data Hubs, Digital Analytics and Experience Optimization
  • Forrester defines digital intelligence platforms as those that merge all the above capabilities, i.e. digital intelligence is about injecting data and analytics into decisions, actions and experiences at the speed of business and digital engagement
  • Big vendors have built - or are building complete stacks
  • Some vendors are specializing in parts of the stack and focusing on best of breed.  For instance, digital analytics platform, CDPs, experience optimization platforms

How are firms leveraging digital intelligence technologies / platforms?

  • For some vendors it is very challenging to understand how clients will evolve their use of the platform
  • Advanced digital intelligence firms like Uber custom-built their own stacks before vendor product offerings were available
  • Many companies are still using traditional data collection and have yet to mature their digital intelligence practice
  • Today, it can take years for firms to build their own platform. That is too long. It is much better to look for off-the-shelf solutions augmented by custom builds
  • The immediate challenge is proving the value of analytics.
  • GDPR and other regulation is pushing firms to take a platform approach. They feel it will alleviate concerns around data control

What are some challenges that exist with using multiple digital analytics technologies?

  • Many firms use multiple analytics tools that appear to track the same thing. E.g. Google Analytics and another enterprise digital analytics tool (e.g. AT Internet, Adobe Analytics etc.)
  • Understandable (although not necessarily best practice) why the above is happening: it is typical to see the agency and advertising team pushing for GA because it integrates with AdWords etc. and is free, while marketing and product teams push for AT Internet for enterprise capabilities and support
  • Having more than one tracking tool can be a challenge, and can be seen as bad practice because it creates two sources of truth
  • Use of multiple analytics tools doing the same thing is potentially a sign of the analytics leadership's weakness vs. a strong marketing team
DHL Group

Most companies consider themselves or at least outwardly claim to be “data-driven”. Yet too many deliver sub-optimal customer experiences and outcomes.

Till Büttner of DHL claims that having a fancy dashboard does not constitute being data-driven. In fact, being data-driven requires a deep cultural change alongside well-configured and integrated processes and tools. And that takes time. So what should we do in the meanwhile?

In this discussion, Till will challenge us to share our success stories as well as challenges in establishing a data-driven culture. We will look to answer these questions:

  • How can we ensure stakeholders seek data-led insight before making decisions?
  • How do we harness senior management to support a data-driven program?
  • What are the practical ground up steps to use for developing that program?
  • What org structure and set-up works best to effectively integrate analytics into all decision-making?

Expect to come away with a handful of practical new ideas based on the successful experiences of your peers.

Data enablement requires a culture in which data is promoted:

  • Data democratization is needed but must be accepted by all
  • All colleagues need to trust each other and the data
  • There a three key ‘Lacks’:
    • Lack of awareness – Storytelling helps to promote the data and makes it visible
    • Lack of ability – Training helps to have more capable users
    • Lack of desire – An intrinsic motivation helps to encourage the desire for more
  • Look to break each one of those Lacks to move towards data enablement

 Are we analysts really data-driven or are we disillusioned?

  • Eat your own dog food -> A Manifesto for Radical Analytics by Stéphane Hamel
  • Stay reliable, do not fake
  • Train the newbies, it helps to identify your own gaps (question the "always been there")

 "Guerilla" tactics which may help:

  • Start internal campaigns about data which exists / can be worked with
  • Create success stories and promote them
  • Sometimes it helps to print stuff to get attention, e.g. to show what data is available
  • Start lighthouse projects to get attention
  • Create Sniper Reports --> small reports about findings, which could go to management if project owners do not wish to cooperate
  • Train the upper management secretly, so they do not get exposed in front of all colleagues
  • From the book "How Google Works" – have a second screen/laptop in meeting rooms to directly look at the data. Or always have an analyst with you in meetings
  • Be prepared to give background information, whenever you can
  • Use the right way and format to present the data to the audience
  • Find the pain of the audience and solve it to get a data warrior (out of the audience) on your side
  • Data meetings – start every meeting with the needed data
  • Gamification to encourage working with data: let colleagues earn points/badges the more they work with data
  • Build an open community of people who like to work with data
Asana Rebel

Many, particularly non-analysts, might consider growth analytics and product analytics as one and the same. Both disciplines examine customer behaviours and share common metrics (e.g. revenue generated, purchase and churn rates, repeat visits and retention rates) but serve different purposes. The main purpose of growth analytics is to analyse the health of the business whereas product analytics is particularly concerned with measuring engagement, informing product development and improving customer product experience.

Ole Bahlmann, VP Data, IT & Customer Care at Asana Rebel has vast experience using both disciplines. In this session, we will discuss and debate how to make the most of product analytics including:

  • Use cases for improving product retention, customer experience and reducing churn
  • What are the essential metrics and KPIs for product analytics?
  • How best to manage the relationship with product team/s?
  • What should your product analytics tool stack look like? Do you need a dedicated product analytics platform?

We welcome practitioners using GA/Adobe Analytics/other for product analysis as well as those using (or looking to use) a dedicated tool. Expect to come away with a fresh perspective on product analytics.

Key highlights

  • Surprisingly, the definition of Product Analytics was not uniform across the group. Focus on:
    • Customer journey, customer experience mapping
    • Processes
    • IoT devices and data
    • Marketing data plus physical data
  • Could include customer feedback through NLP in one example
  • But agreement on customer centricity, and on product analytics as a contrast to marketing analytics
  • Understanding Customer is core of product analytics (but definition of customer can vary between businesses)
  • Some orgs have silos: marketing analytics vs product analytics but generally that is improving by getting disciplines closer
  • Skills/Tools required for product analytics:
    • Curiosity about customers and product (similar to Product Management)
    • A/B testing
    • VoC (ask questions such as “what could have prevented you from buying?”)
    • Internal sources, get insights from co-workers
  • Book recommendation: Customers Included - Mark Hurst

The “build versus buy” decision is a significant one that many companies face when addressing their software needs. Build offers the opportunity to create a perfect fit but comes with risk and often high cost. Buy is normally safer and faster but, in some cases, cannot address specific use cases.

In this discussion, Kevin Holler of Fivetran will navigate us through the pros and cons of data pipelines build vs buy. We will seek to answer the following questions:

  • What does a modern data pipeline framework look like?
  • What are the use cases for build and for buy?
  • How can you make the most of your data pipelines?

Join us for a practical discussion on data pipelines and leave with a more informed view of what your strategy should look like.
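To ground the “build” side of the build-vs-buy debate: even a minimal hand-rolled pipeline step must handle incremental extraction, state (a high-water mark) and loading. A toy sketch using an in-memory SQLite table as the warehouse (the source shape and schema are invented for illustration):

```python
import sqlite3

def extract(source_rows, since_id):
    """Incremental extract: only rows newer than the last loaded id."""
    return [r for r in source_rows if r["id"] > since_id]

def load(conn, rows):
    """Load the delta into the warehouse table."""
    conn.executemany(
        "INSERT INTO orders (id, amount) VALUES (:id, :amount)", rows)
    conn.commit()

def run_pipeline(conn, source_rows):
    """One sync cycle: find the high-water mark, extract the delta, load it."""
    (since_id,) = conn.execute(
        "SELECT COALESCE(MAX(id), 0) FROM orders").fetchone()
    delta = extract(source_rows, since_id)
    load(conn, delta)
    return len(delta)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
source = [{"id": 1, "amount": 9.5}, {"id": 2, "amount": 20.0}]
run_pipeline(conn, source)             # first sync loads both rows
source.append({"id": 3, "amount": 5.0})
run_pipeline(conn, source)             # second sync loads only the new row
```

A bought pipeline product handles schema drift, retries, deletes and dozens of connectors on top of this core loop, which is where most of the hidden build cost lives.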

The core principle of digital analytics is the transformation of online data into insight and insight into action, generating added value to the enterprise. However, as part of the analysis process an analyst is likely to come across interesting data points without having a clear view of how to act on them. Too much information could lead to confusion. How do your analysts deal with such situations?

Come share your experiences and together we will emerge with solutions to these questions:

  • How to balance between too much and too little data when presenting to non-analysts?
  • What techniques do you use to share interesting insights discovered in the analysis process that are not part of the core question you were asked to investigate?
  • Should analysts always have an explicit objective stated before doing any analysis?
  • How do you allow analysts the freedom to explore in analytics?

Moderated by Sandra Kampmann, Head of Analytics & Insight at ASOS, we will emerge from this discussion with fresh ideas on how to improve our analysis process and on how to deliver insights more effectively to our stakeholders.

Relationship building

  • Teamwork is important. Analysts do not sit in isolation
  • Stakeholder buy-in from the start is vital to building trust and generating insights
  • Some expressed concern over being seen as a "data reporter"
  • Regular contact with stakeholders is helpful. Data alone will not help you make your point
  • Creating one centralised insights team / single source of truth across multiple disciplines (e.g. digital analytics, BI etc) helps
    • Crystallise the business' view on data and how it is used
    • Encourage team to champion each other across the business
  • Soft skills are vital to building trust and developing relationship

How do people feel about the importance of data competency, literacy and accuracy?

  • Data usage across the business is growing but slowly for some
  • The more data literate stakeholders are, the easier it is to drive adoption within the organisation
    • Educating the stakeholders is important but needs to be done in context of each area of responsibility
    • Do not tell stakeholders what you do, tell them how you can help them
  • Presenting findings is an art-form
    • Some organisations have adopted a peer review process for presentations
    • Presentations are digested then audience has to say what they think the central message was. This is then compared with what the author intended
  • Always put your summary at the start of the presentation (and at the end)
    • Especially helpful if you are presenting to senior management who are generally only interested in the headlines
  • Possibility of giving training on dashboards for example through videos

Exploratory vs reactive insights

  • Predictive analytics
  • Block off the time to allow team members to work on what they would like
  • Move away from least impactful 25% of work and spend 20% of time working on what you like
    • To make it work is not easy. Make it a departmental goal
    • Do not measure the output of the 20% because it takes away from the impetus
  • What decisions will you make differently based on this data output?
Hello Fresh

How to build a truly effective analytics programme in a fast paced volatile, uncertain and complex environment? How to establish a strong team with a high degree of diversity from different domains and cultural backgrounds? At the same time, the team must develop and maintain a high level of credibility with other parts of the organisation in order to be (a) heard and (b) given the necessary funds and support to implement and run the programme. These are challenges faced daily by Tim-Fabien Pohlmann, Director of Marketing BI at HelloFresh. In this discussion, he will encourage delegates to share their views on and experiences in designing an organisational structure that supports their business mission. We will discuss how to structure the programme and develop it over time; ask what skills and roles are essential from day one and which could be added as the programme evolves? And how should the analytics team be integrated into the larger organisation?

We will discuss how analytics managers can excel in this uncertain, complex and ever changing environment:

  • Cultural diversity in growing teams
  • How to pivot your team from small and multi-functional to large and specialist-focused?
  • What process adjustments are required as your team expands?
  • How best to nurture talent in an increasingly divergent team?

We invite both analysts and managers to join the conversation which will produce new insights practical for all.


  • Good practices here - analyst / BI person
    • Be with us for some time
      • Questions to ask:
        • How long do I have this need for that person?
      • Recruiting the best person might not be good considering the runway we can offer in terms of career progression
        • Social aspect is also important - Friday drinks is important
      • Personality types
        • Career oriented person
          • Without a career plan they leave
        • Insecure professional
          • Main motivation is peer recognition
          • Do not make this person a manager
      • Two tracks
        • Managerial and Analytical side
        • Do not compromise on recruitment
        • Three stage interview loop
      • Recruitment
        • CV screening
          • Letters of recommendation are not interesting
          • CV together with letter of motivation gives a good overview
        • HR partner is important and can reduce the time to recruit
        • Ask around - what blogs do you follow
        • Phone call is important
        • SQL - technical test
          • Talk about advanced techniques
          • Comments
        • Full day onsite with behavioral
        • And product experience is important
          • What is harder to train
        • Post recruitment testing
          • Probation time checks
        • Reference persons can work well
          • Try!! Ask the person the following question:
          • Would you hire that person again if you had the chance a second time?
        • Also: could you imagine having a glass of wine with them?

Onboarding and integrating to the organisation

  • Take care of language
  • Prepare a work document:
    • Here is what I expect from you in the first weeks and here is what I expect in the first month
  • Differentiate by the seniority of the new joiner - for senior hires, more of a peer setup
  • For more junior hires, hands-on onboarding
  • Peer reviews need to be in place especially for new hires
  • Professional plan for the first twelve months
  • Prepare a technical skill survey
    • Here is where you are now
    • Here is where you want to be in 12 months
  • Assign a buddy to the new joiner

Integrating the new joiner to the team

  • Review / Retros - how to make sure that results will be the same for each analyst
  • Leave the flexibility to go different ways
  • Discuss best practices
  • Give everybody the same assignment and compare the results
  • Make sure everyone knows why we are doing this - what is the purpose of the whole work?
    • Create an analytics playbook - describe the why
    • Value for the organisation is higher than some other things the analytics team does

11:45 Virtual Coffee Break

12:00 Gold Keynote

As Simon Pilkowski, Global Lead Digital Performance Management at Boehringer Ingelheim, likes to say: "A fool with a tool is still a fool, but a fool with a process at least knows what to do next." And establishing a foundational, repeatable process is critical when governing marketing technology and data at scale.

Join Simon and Tim Baird, Director of Sales at ObservePoint, as they walk through how Boehringer Ingelheim fully automated their digital governance processes across more than 550 websites. Through the steps in their example, you will learn how to:

  • Establish automated data governance across your digital properties
  • Manage MarTech at scale throughout website mergers and updates
  • Govern data collection for all your properties on an ongoing basis
  • Use data governance to comply with privacy regulations
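At its core, automated governance of this kind means crawling your properties and checking each page against a ruleset. A toy sketch of one audit step (the rules and page markers are invented for illustration; a real setup would fetch live pages and cover far more checks):

```python
REQUIRED_SNIPPETS = {          # illustrative governance rules
    "analytics": "analytics.js",
    "consent": "consent-banner",
}

def audit_page(url, html):
    """Check one page's HTML against the governance rules and
    report which required snippets are missing."""
    missing = [name for name, marker in REQUIRED_SNIPPETS.items()
               if marker not in html]
    return {"url": url, "missing": missing, "compliant": not missing}

pages = {
    "https://example.com/": "<script src='analytics.js'></script>"
                            "<div id='consent-banner'></div>",
    "https://example.com/old": "<html>legacy page, no tags</html>",
}
report = [audit_page(url, html) for url, html in pages.items()]
```

Run on a schedule across hundreds of sites, reports like this turn governance from a one-off manual audit into a continuous process.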

12:10 Round Table Discussions - Round #4

Each delegate selects and participates in one discussion from below

Round #4
Tue, June 16
12:10 - 13:30 CET

Andrea Mestriner

A recent McKinsey report found that customer-focused organisations are 23x more likely to attract and 6x more likely to maintain customer relationships. Doing so successfully requires a combination of technology, processes and organisational focus on the customer. Customer Data Platforms (CDPs) promise to deliver on the technology side of things.

However, implementing a CDP is a significant financial, resource and time undertaking. Most organisations would demand a quick return on such a commitment.

Luxury online retailer, Yoox Net A Porter, is in the midst of a CDP implementation project. In this discussion, led by Andrea Mestriner, Global Head of Analytics, we will discuss:

  • The essential conditions for initiating a CDP project
  • What are the short-term business use cases (to justify the investment)?
  • Who should be involved in the project? What should the selection process look like?
  • What organisational changes are required beyond just the tech stack to make a CDP project a success?
  • What are the long-term use cases?

Are you considering a CDP? Do you need to get management buy-in? This will be a practical conversation designed to give you a good view of the critical aspects of CDP so you can go back to your management with a rational view of the pros and cons of such a project.

Join this discussion and walk away with a list of ideas to demonstrate value add immediately following a CDP implementation.

Definition of Customer Data Platform (CDP)

  • Some confusion in the market about what it really means
  • Multiple interpretations:
    • A data collector and processor to help marketers and simplify data engineers’ lives
    • The above plus capability to democratise ML/AI for segmentation and user profiling
    • All that plus a channel orchestration solution
    • All above plus an analytics platform

Process considerations

  • Organisational needs are the obvious starting point
  • Next consideration is whether to build it in-house or buy off-the-shelf solution
    • Main pro of building it in-house is integration with your ecosystem but support and long-term effort could be bigger than an off-the-shelf solution
  • Have a detailed scoping document from the outset
    • Will save you enormous time and pain when building / selecting a vendor
  • Privacy
    • Define what data you wish to load into the CDP
    • Run it against what you can load from a privacy PoV; consult your DPO or a specialist privacy lawyer
    • Consider how to manage and upload your first party data
    • Outline the relationships between the data and the courses of action you will take (i.e. real-time online targeting, marketing optimisation, call centre data feed etc.)   
    • Define how to set and manage identity resolution especially data stitching if anonymous data is also used
    • Must have a clear view on how to identify, flag, encrypt and be able to delete any PII to support GDPR
    • Design access policy and environment setup
    • Be very careful about what you do with the data in a CDP
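One common pattern behind the identity-resolution and right-to-erasure points above is to keep PII in a separate, access-restricted mapping keyed by a surrogate id, so an erasure request only touches one table while anonymous behavioural data survives. A minimal sketch (the class and field names are illustrative, not a vendor API):

```python
import uuid

class IdentityVault:
    """Minimal illustration of keeping PII separable from behavioural
    data so a GDPR erasure request only touches one table."""

    def __init__(self):
        self.pii = {}      # surrogate_id -> raw PII (access-restricted)
        self.events = []   # behavioural data keyed only by surrogate_id

    def resolve(self, email):
        """Identity resolution: the same email always maps to one surrogate id."""
        for sid, record in self.pii.items():
            if record["email"] == email:
                return sid
        sid = str(uuid.uuid4())
        self.pii[sid] = {"email": email}
        return sid

    def track(self, email, action):
        self.events.append({"user": self.resolve(email), "action": action})

    def erase(self, email):
        """Right-to-erasure: drop the PII mapping; events become anonymous."""
        sid = self.resolve(email)
        del self.pii[sid]
```

Real CDPs add encryption and audit trails on top, but the separation of identity from behaviour is the design choice that makes deletion tractable.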

Teams & Users

  • Be very clear about which stakeholders will have access to the data in the CDP
  • Create a community around these stakeholders as they are the ones that will help you demonstrate value and drive results
  • Foster sharing and collaboration between these stakeholders
  • Most of the time it will have to be managed as a change management project


CDPs vs DMPs

  • CDPs and DMPs are different as a solution and from a legal standpoint
  • DMPs are still here to stay given the level of investment sunk into them and the use cases they support

To keep pace with the speed of change -- business, customer and technology -- we must leverage our community, internal peer groups and external networks. Building an analytics community at your company brings together like-minded people across organisational silos to solve business problems. Your external network provides a forum to develop thought leadership, help others and be inspired.

In this discussion, led by Michael Böhme of Zalando, we will examine the following questions:

  • What are the most tangible benefits for creating an analytics community?
  • Which approaches for introducing an analytics community work best?
  • What are the quantifiable and intangible benefits of such a community?
  • By its nature, a community structure is less hierarchical – what format should it take and how do we keep it moving forward without formal governance?

Join us for a stimulating discussion about a new approach to sharing analytics knowledge, and keeping your talent stimulated and engaged with other analytics functions in your company.

What are the most tangible benefits of an analytics community?

  • Social
    • Networking events to further develop relationships and meet new people
    • Have some fun with peers
  • Speed
    • Leverage “swarm intelligence”
    • Explain KPI drops and spikes faster
    • Avoid duplicate work by sharing proven solutions
  • Quality
    • Get peer feedback about new approaches or tools you are using
    • Learn from colleagues how they’re solving for similar challenges
    • Exercise presentation skills and writing skills when analysts speak at meetups and share articles and papers
    • Analysts look for career growth when contribution and acquiring new knowledge are valued at their company  
  • Events do not equate to community --> requires leadership and continuity
  • Central analytics cannot handle everything. Communities can help with having contacts to forward questions (multiplier)

What are the unquantifiable and intangible benefits of such a community?

  • Broadened perspectives improve capabilities and skills and keep analysts motivated through learning opportunities that feed into career growth
  • Feeling valued and appreciated through sharing
  • Diversity and inclusion by providing a platform for everyone
  • Safe environment to experiment, propose new ideas and develop
  • Keeping in touch with former colleagues who may switch positions / companies
  • Feeling part of something bigger
  • For Zalando, having a community recognised outside the company strengthens its analytics brand, increasing the overall standing and competitiveness of Zalando’s analysts
  • Help in crisis with advice and even job opportunities
  • Meeting people who share the same challenges

Which approaches for introducing an analytics community work best?

  • Starting out:
    • Requires a concentrated initial (mainly time) investment to get things rolling
    • Nominate/elect a dedicated community manager
      • May rotate regularly to reduce workload but also to engage others and give them an opportunity to shape the community
      • Look for people with intrinsic motivation
    • Agree regular recurring meetings (set them in the diary!)
      • Ask subject matter experts to offer first sessions
    • Establish a Slack channel for self-help
    • Showcase your work in a demo place, where people can pop in and ask questions
    • Need to get visibility --> add an “analytics corner”; spark conversation
  • Consider creating a logo together
    • Need not be professionally designed
    • Creates identity, exclusivity, sense of belonging and pride
    • Make printed goodies and laptop stickers
    • Create your own internal brand with unique activities (e.g. a fancy-dress day)
  • Build the community up based on analysts’ input. Use surveys to inform direction
    • A sounding board of several people with diverse experiences who provide feedback on new initiatives (like an advisory board)
    • Start with Alumni network, “legends”
  • Offer training sessions so people get together, then mix people from different sessions
  • Work together to truly understand stakeholders’ pain points
  • Too wide vs. too narrow
    • Maybe start open and split up to sub communities
  • Budget / finances
    • When working for start-ups ask your VCs
    • Talk to vendors. They could have a vested interest in supporting relevant communities. For example, Google supports many local events
    • Instead of pay raise, negotiate a learning budget
    • Ask internally for sponsorship of events

Community structure
By its nature, a community structure is less hierarchical – what format should it take and how do we keep it moving forward without formal governance?

  • Create win-win situations: Zalando made community contribution part of the analyst role description
  • Have contact persons, providing templates and peer reviews from seniors before sharing something company-wide; consequently, building up juniors’ confidence
  • Offer benefits for different job grades:
    • A junior may want to learn
    • A head-of may want to promote their department and open positions
  • Zalando approach 2020 due to Corona:
    1. Start with focus on the digital community on the Intranet, share one article from a team per week and build up pool of content and people willing to participate
    2. Measure article engagement through likes, comments and sharing
    3. Conduct quarterly surveys: “Do you want to hear more from XYZ and get to know them?”
    4. Organize several small meetups around most liked topics or methods
    5. Organize annual internal offsite events picking up most liked meetup topics
    6. Organize ~biennial external events with top internal speakers

What about external communities? Do you have some examples or tips where to meet experts and exchange?

  • Measure Slack, Web Analytics Wednesday, Measure Bowling
  • Digital analytics meetups & MeasureCamps in your local area
  • Look to combine internal and external communities
    • Have members of each community present/share in the other to expand knowledge share, bring the two communities closer and enrich both

Businesses have one thing in common – they should be focused on the customer. This means their success is based upon removing the blinkers, expanding beyond a transactional view and understanding as much as possible about their customers. Who they are, how they behave beyond their interaction with the business, and how to get more of them. The challenge, however, is that a single brand or business is in the position of providing, at best, a handful of products or services to any single customer which greatly limits the scope of what can be observed directly.

Burberry is placing an ever-growing emphasis on understanding and serving their customer. In this discussion, Ben Stephens will focus on known customers or prospects where there exists a degree of direct contact – comms data, transactional data, physical interactions in a store. We will explore the different methods and opportunities for improving the quantity and quality of data held on the customer covering anything from smarter/progressive data capture to good old-fashioned buying it. Whenever we touch on the customer there will inevitably be privacy or ethical considerations however that is not the key focus of this discussion.

Finally, we will address the critical question of how we can actually use what we know for the benefit of the customer and not resign the most valuable data to the confines of a PowerPoint deck.

How well do we know the customer?

  • Typically, an impersonal view of a number or ID – can be born from system definitions or obscuring PII where it is the only way to work
  • Can reduce customer to list of sales targets
    • --> Callout: online gaming is a good industry for recognising customers
  • More problematic when an account is not a 1:1 relationship but represents a household or group (subscription)
  • Knowledge of a customer often resolves back to value (lifetime or potential marketing value)
  • Often inequality in understanding of customers locally vs globally (global = centralised point or head office)
    • A global view often lacks understanding of local customer activity; differing local laws/legislation mean the same knowledge of a customer enables different levels of interaction and outreach
    • Additionally difference in knowledge based on direct contact with customer – head office ‘digital’ or ‘data’ view only however store staff can see and speak to the customer often leading to a different but uniquely valuable view of the customer
    • Ditto Customer Care – direct customer contact even in digital businesses
    • How to link/benefit from both is a challenge
  • Customer can become a rear-view exercise linking an id back to previous activity rather than a forward-looking investment
  • Loyalty schemes an approach to improving and levelling out customer knowledge.
  • Callout useful to ask for more attitudinal/intent based information to better understand customer needs and journeys
  • Difficult to systematically ensure quality or coverage in this area however
  • Key question, can businesses be confident in qual or derived data?
    • Use derived or unstructured qual data to test assumptions and then validate against those where there is higher certainty
  • Can be difficult where alternative measures of customers are pushed – for example online tools push to marketers ‘lifetime value’ metrics but i) only based on cookie lifetime and ii) visitors are not customers
  • Attitude towards customers within the organisation are important as will influence the overall view
  • Key question – what is the value exchange for customers? Why would they volunteer their data or login/identify

How to enrich customer data further/get more & How to use it?

  • Use it to change our customers (better understand who they are and who you want them to be) or use it to change business
  • Trade-off between short term & long term
  • 1st party data needs to be a value exchange and 3rd party data (your customer, a 3rd party platform) is increasingly common but difficult to get hold of
    • Important to embrace 3rd party platforms (examples of WeChat for China over internal email as a comms platform) as it is based on customer choice so harmful/expensive/ineffective to try to battle against in most cases
    • Focus on customer benefit first; then data acquisition will be easier in the long term, as you are more likely to have an opportunity/journey that crosses your brand 1st party
  • Different considerations when in 1:1 vs 1:many scenario as households/group IDs can send mixed signals so building relationship is difficult given sometimes contrasting tastes/preferences
    • How useful then are demographics based on regions/households? Some useful elements can be derived
    • Cost of making a mistake based on assumptions is high though
    • Use testing to validate inferences with low risk activities
    • Derive or predict only to the point of being useful – so if you are targeting 30-50 yrs. olds based on age then do not need to calculate exact age, just broad segment
  • Predicting or assuming behaviours can be more valuable than deriving attributes as more action oriented
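The "derive only to the point of being useful" idea above can be made concrete with a minimal sketch (the segment boundaries are hypothetical): if targeting is based on a 30-50 age band, a coarse bucket is all that needs to be derived, not an exact age.

```python
# Illustrative sketch: derive only the broad segment needed for targeting,
# rather than predicting an exact attribute value.

def age_segment(age):
    """Map an (estimated) age to a coarse targeting segment."""
    if age is None:
        return "unknown"
    if age < 30:
        return "under-30"
    if age <= 50:
        return "30-50"
    return "over-50"

segment = age_segment(35)  # "30-50"
```

Predicting the coarse bucket directly is usually easier and more robust than estimating the exact value, and it reduces the cost of being wrong noted above.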

Privacy & legal (ran a little low on time)

  • Generating consent is key – as important as getting information to begin with so a non-consenting sign up is only half ‘complete’
  • Work with consent frameworks to generate consumer trust and benefit that way, rather than trying to circumvent them
  • Often considerations are different for capturing data vs managing data so be clear on what’s what

We are told that every company is now a data company, and yet we wrestle with a variety of problems related to getting businesspeople to value, understand, and use data in their day-to-day work.

In this huddle we will explore the topic of data culture and discuss what self-service means in the context of our own businesses. Led by June Dershewitz, Director of Analytics at Twitch, we will look to answer the following questions:

  • How do you develop a strong data culture? Is it a top-down mandate from executives, a grass-roots campaign led by data staff, or a bit of both? What tactics actually work to build and sustain a data-driven mindset?
  • Who are the different customers of self-service analytics, what motivates them, how do their needs differ, and how can you best enable self-service for each group?
  • What are the dangers of self-service? What could possibly go wrong once you empower everyone to become their own analyst? Or, to put a positive spin on it, how do we make data literacy pervasive?

If your analytics resources are overwhelmed, you are wrestling with your organization over the adoption of self-service analytics or you are in data-driven nirvana then this is your opportunity to share and learn how to maximize analytics ROI via self-service.

During the introduction it became clear that this session would focus on two main aspects:

  • How to foster a data driven culture
  • Do's and don'ts for self-service and how to scale it

Foster a data driven culture

  • A data driven culture needs to be fostered from both sides: top-down & bottom-up
  • Some challenges that might arise:
    • C-Level does not always understand the value right away unless it targets one of their specific pain points
    • Shortage of time and interest within middle management
    • Data does not always fit the agenda
  • 80% of users use only 20% of the solution
    • Focus on that part for self-service scalability
    • Other 20% are most likely analysts or heavy users
  • Depending on the size of the company and the size of your team you might want to have additional strategies for those (i.e. 1:1 training, focus groups, knowledge sharing)
  • For self-service you need demand from the user’s side. Try to create this demand through marketing within the company and a clear focus on the reasons behind it
    • "Sell them the ice cream"
    • A common case for self-service is to release the bottleneck analytics team and free up time so the analyst can create more value
  • Try to create momentum through fostering data usage with specific departments or people. Their success will make other people and departments want to engage as well


Do's and don'ts of self-service

  • Ensure accountability on reports sent out (or stop sending them out)
  • Engage people to use data:
    • Include stakeholders in the report creation process
    • Have stakeholders/analysts workshops
    • Use the domain knowledge of your stakeholders
    • Work with proven use cases to increase engagement
    • Try communities or community tools like:
      • Sharepoint hubs
      • Facebook for business or Yammer
      • Slack channels for self-service
  • Foster data literacy
    • Keep in mind that this is an ongoing process, not a one-off project
  • Think about analytics and self-service in a product ownership context
    • Have personas of users
    • Market to stakeholders
  • Self-service mainly used for ongoing requests but rarely for onboarding
    • Feared to be too complex
    • External or internal conferences or roundtables can be part of onboarding, as well
  • Train-the-trainer works well in small companies but might not work well in larger companies for various reasons
  • Always keep in mind that there are various learning types and people you need to interact with
    • Some love self-service
    • Some will need a 1:1 training
    • Others will be reluctant
    • Invest in each segment above accordingly
      • Might not be worth trying to convert the ‘reluctants’
      • Success with advocates is the best way to convert the ‘reluctants’ 

Today’s consumers are demanding increasingly personalised, relevant communications from the brands they use. The rapid growth in customer data and technological progress make it possible for marketers to achieve true personalisation at scale for the first time, by harnessing the power of machine learning to match the right message to the right customer, in the right context, at the right time.

OTTO has invested heavily into resources for a test & learn approach in personalisation, trying different approaches and methods customizing the user journey offsite and onsite to the specific needs of consumers.

In this huddle, Steffen Methling, Senior Marketing Analyst at OTTO, will discuss progress, learnings and challenges using user data to personalise content decisions. We will investigate the types of algorithms and data that is used to come up with the best message for each customer. We will talk about tactics, challenges, revelations and dead-end streets in automated data-driven marketing decision making.

Make this your personal journey to a leap ahead in user experience.

Key highlights

  • Real artificial intelligence is not yet visible in marketing, but there is progress in algorithm-based personalisation
  • Article recommendations and smarter retargeting often first step towards personalisation
    • E.g. “Last seen products”, “other customers like you also looked at”
  • Combination of algorithmic segmentation and business rules a good start:
    • E.g. Purchase prediction to reduce marketing spend on people who will buy anyway and those that will never buy, whatever you do
    • Some participants shared positive experiences with group
  • Persona and psychological segmenting messages potential next step
    • None of the group has tried so far
    • Why do certain people buy and why do some not? Should be one of the first questions asked before initiating a personalisation programme
    • User behaviour often hard to predict and influence, some might not know what they want and some might know too well and might not be brought off course easily
  • Knowledge graph of user characteristics is a good start to process touchpoint data and attributes
    • E.g. preferred channel, preferred device, preferred product category etc.
  • Must have reliable, rich and pretty large data set to feed the algorithms
    • Often the question is not what data to add but what data to leave out
  • Consent and cookie management become even more important with personalisation
    • Must be explicit and clear with customers on how you are using their data and how it would help them
  • Whilst the value of personalisation seems very intuitive, it is not easy to measure
    • Unless you can maintain a consistent exclusion segment to benchmark against
    • Only practical if all users log in (e.g. can be done in walled garden environments such as Facebook, Gmail etc.)
    • Sometimes only click rates improve
    • Better user experience and relevance will likely lead to better Customer Life Time Value
    • Could use customer surveys to try and gauge improvements in experience (e.g. NPS)
  • General note: be aware of the consequences personalisation might have concerning your local GDPR implementation (easily a topic on its own for discussion)
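The purchase-prediction tactic discussed above (reduce spend on those who will buy anyway and those who will never buy) reduces to a thresholding rule over a propensity score. A minimal sketch, with hypothetical thresholds that would in practice be tuned against a holdout group:

```python
# Illustrative sketch of propensity-based spend suppression. The scores
# would come from a purchase-prediction model; the thresholds are
# assumptions, not values from the discussion.

def spend_decision(p_buy, low=0.05, high=0.90):
    """Decide whether marketing spend is worthwhile for this customer."""
    if p_buy >= high:
        return "suppress: will buy anyway"
    if p_buy <= low:
        return "suppress: will not buy"
    return "target"

scores = {"c1": 0.95, "c2": 0.40, "c3": 0.02}
decisions = {c: spend_decision(p) for c, p in scores.items()}
```

Measuring the uplift from such a rule runs into the exclusion-segment problem noted above: without a consistent control group, only the saved spend is directly observable.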
Boehringer Ingelheim

Too often technology solutions become technology problems. The size of your marketing technology stack has grown to rival Everest, and it’s ready to topple over. You need a little help, especially with those cumbersome tags you had to deploy on your site to get your MarTech up and running.

In this session, Simon Pilkowski, Global Lead Digital Performance Management at Boehringer Ingelheim, and Tim Baird, Director of Sales at ObservePoint, will lead a discussion on how to establish the right processes and use automation to govern your marketing technologies and ensure the accuracy of the data collected.

Join the discussion and learn how to:

  • Scale data governance across all your digital properties
  • Manage your MarTech through website mergers and regular updates
  • Comply with data security and privacy regulations
  • Automate ongoing data governance to ensure continued data accuracy

What makes data governance so difficult?

  • Misalignment between people, processes and tools
  • People move a lot both internally and externally
    • Poses a challenge for continuity, particularly for data governance
  • One participant noted that start-ups generally have a high level of data maturity but low employee tenure, whereas very large B2B companies have a low level of data maturity with very long employee tenure

How do you effect a process for good?

  • Must have a process but at the same time be able to modify as circumstances change (e.g. change in corporate objectives, integration of new tech etc.)
  • Staff turnover is very detrimental to the process
    • Good documentation is critical to maintaining continuity
  • Stakeholders get nervous if they do not know how much of the data was audited
    • Better have part of the data fully audited vs. all data partly audited
    • Maintain a record of progress
    • Documentation, documentation, documentation
  • One participant’s company has two people dedicated to data governance + a team dedicated to educating stakeholders on Tableau and SQL
    • Looking at data catalogue tools (e.g. Informatica, Cloudera and Alation) to support data governance programme
    • Data Catalogue provides a single self-service environment to find, understand and trust the data source

Data governance - top down or bottom up?

  • Should be top down but must have the bottom accept and implement it
  • Bottom up does not work in very large-scale enterprises
    • Must have the senior management support
    • Otherwise, initiative is never prioritised and always challenged
  • Governance structure in a large matrix organisation will include a business lead (for Analytics) and an IT lead for the tool/s
    • As with any other project in large enterprises, data governance success will depend on working with all stakeholders cooperatively
    • Particularly true given this is a process not a project
  • How do we evaluate the success of a data governance strategy?
    • Minimise the number of dashboards per metric
    • Increase analytics productivity
    • Reducing time spent on fixing data bugs
    • SLAs could cause a problem because they focus on the process and data but not on the people
      • Without focus on people data governance is just a service provided with low value
    • Often very hard to prove ROI on analytics; even harder for data governance
      • Therefore, look for softer metrics within the org
      • Measure stakeholders trust in data over time via short and simple recurring survey (could be NPS-ish)
      • Will give you a sense of direction which could be shared with senior management as soft proof of progress
      • In addition, consider conducting occasional in-depth interviews with stakeholders to flush out various analytics satisfaction questions including trust in data
      • The survey can run quarterly for all relevant stakeholders whilst in-depth interviews rotate between stakeholders (so, for example, each stakeholder is interviewed once per year)
  • Management must understand that like analytics, data governance is not a project but a continuous process
    • Seems obvious to us – analytics practitioners. Not so for most others in the org
    • Must educate to explain both the significance of data governance and the continuous nature
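The NPS-ish trust survey suggested above is simple to operationalise. A minimal sketch, assuming a standard 0-10 "How much do you trust our data?" question scored with the usual NPS promoter/detractor bands:

```python
# Illustrative sketch: compute an NPS-style trust score from a recurring
# stakeholder survey (0-10 responses). Scoring bands follow the standard
# NPS convention: 9-10 promoters, 0-6 detractors.

def nps(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

quarterly_responses = [10, 9, 8, 7, 6, 9, 4]
trust_score = nps(quarterly_responses)
# Track this number quarter over quarter as soft proof of progress
```

A single quarter's number matters less than the trend, which is what gets shared with senior management.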

13:35 Sponsor Demo Rooms

13:50 Summary & Prizes

14:00 Happy Hour

14:30 Workshop

Tue, June 16
14:30 - 16:30 CET
ABN AMRO

What gets measured gets managed, thus, KPIs serve as the cornerstone for any strong analytics programme…or any management programme for that matter. But are you using the measures that matter most? The wrong KPIs will likely drive your organisation in the completely wrong direction.

Join this interactive workshop with Julia Jukabiec, targeted at both analysts and managers, to learn the principles of effective KPIs. Julia will walk you through the process, and we will then spend the majority of the workshop on two real-life exercises that will help you practise and internalise the principles shared in the first part. If setting up KPIs and managing reporting forms part of your responsibilities, this workshop will give you fresh ideas and perspectives on setting KPIs.

16:30 End of Day

Wednesday, June 17th

9:50 Virtual Lounge (checking your tech & meeting other delegates)

10:00 Opening Welcome

10:10 Platinum Keynote

What is the best way to execute your data integration tasks – writing your own code or using an ETL tool? Find out the approach that best fits your organisational needs and the factors that influence it including:

  • How data management is changing
  • What is the Total Cost of Ownership (TCO) of being data-driven?

10:25 Round Table Discussions - Round #5

Each delegate selects and participates in one discussion from below

Round #5
Wed, June 17
10:25 - 11:45 CET

Forrester research has studied insights-driven businesses that are gaining market differentiation with synced-up data & analytics approaches which deliver coordinated decisions, actions and experience at scale. By 2021 these businesses will together generate $1.8 trillion/yr in revenues.

In this session, led by James McCormick, Principal Analyst at Forrester, we will explore the digital analytics and intelligence frameworks and practices learnt from these insights-driven businesses. We will discuss the confused landscape of digital data, analytics and experience optimization technologies including:

  • The landscape of digital intelligence capabilities and how the tech fits together
  • Major technology subcategories and their leading vendors: e.g. web analytics, mobile analytics, CDPs etc.
  • Strategies for building advanced digital intelligence practices

Join James for an exciting discussion about maximising the value of your organizational digital analytics strategy.

What does digital mean?

  • Digital includes online customer engagements and experiences but is not exclusive to them; it now includes physical-world engagement at kiosks, in branches, via connected devices etc.

What is digital intelligence? Why do we need it?

  • Digital intelligence is moving beyond marketing and commerce teams to teams such as: product, services, risk management etc.
  • Some of the discussion participants stated that it is like military intelligence... guiding commanders / business leaders on tactical / strategic decisions and actions
  • Automation was important.  The data to action / experience flow should be quick and real time when appropriate. It must not unnecessarily be slowed down by humans looking at dashboards and making decisions
  • Most companies were afflicted by a torrent of data but a trickle of intelligence. Thus significantly reducing the value from data

What is the business case for digital intelligence?  Why is it important?  How do we convince execs?

  • Much of the value of digital intelligence is delivered via personalization. However, some participants felt that the time spent reconciling data for personalization was holding the practice back
  • Many firms are jumping on the data monetization bandwagon. They are selling customer / audience engagement data to advertisers and other firms
  • Some participants felt that the ultimate value of digital intelligence practices was its potential to maximize customer lifetime value
  • Other participants had a grander vision for digital intelligence: one that delivers competitive differentiation by allowing firms to optimize customer engagements at scale; as well as allowing them to create innovative experiences and value

How do we get it right?

  • Need a solid vision and business case: while ROI descriptions in terms of engagement uplifts (e.g. improved click-throughs, conversions, revenues etc.) are important, it was felt that a business case is not complete without a top-down view. This top-down view must link the investment to strategic initiatives such as business growth, innovation and differentiation
  • It was felt that the time to accelerate investment and efforts around digital intelligence is now. Tomorrow is too late
  • Executive ownership is a must. Who exactly owns digital intelligence depends on the maturity of the business, the market vertical and organization. But the owner MUST have authority to synchronize the practice and associated processes across the entire enterprise
  • The strategic components of a digital intelligence strategy include considerations around: data, people, technology, activation / optimization

Data-driven organisations anticipate shifting market trends and business needs. Rather than react to changes, they work proactively to optimise outcomes. Those that do not reinvent and modernise their data stacks lose customers, revenue and market share.

Led by Kevin Holler, we will discuss how to build a future-proof data architecture to keep our organisations competitive. We will look to answer the following questions:

  • What are the characteristics of a modern data architecture?
  • What steps should you take to ensure your data architecture is fit for purpose?
  • What are some key use cases for driving ROI on your investment?

Our output might include a list of positioning guidelines and advice on helping one’s company become more competitive through smart use of data.

Key highlights

  • How do we identify where to start?
    • For growing B2C companies two challenges:
      1. Cannot use GA or Mixpanel anymore
        • Expansion problem
        • GA no longer straightforward to use (if not a dedicated digital analyst)
        • Mixpanel could be a lot more powerful than GA (more flexible) but cannot easily export the data --> not great for optimisation / targeting
      2. Cannot tie Customer LTV to cost of acquisition
        • Hard to justify to investors any investment when unable to do that
    • Event tracking data can be sent to marketing / other tools in a relatively straightforward manner these days
      1. For most start-ups using the data for targeting customers is a priority
      2. Only later does the need to satisfy the organisation’s analytics needs kick in
    • GA is not customisable at all – it is "opinionated" – vs. Snowplow, which is totally customisable but takes a while to configure and learn
    • Tools such as Amplitude and Segment sit in between those two extremes
      • Have some customisation but could be set up quickly
      • One participant in the discussion suggested to go either all in or all out
        • If you are already investing in customisation, there is no point being in the middle, as it will create challenges in the future
        • But also acknowledged that it depends on the company and resources available as clearly there is a market for those in-between tools
        • Given Segment’s pricing model (cost increases with traffic / events captured), it would become more expensive as your traffic grows
          • Would suit companies where revenue growth is tied to traffic growth
        • Very dependent on the company risk profile and human resources available (e.g. data engineers that can run a complex implementation project)
  • Who decides which analytics frameworks to use in big vs. small companies?
    • A risk-management decision
      • Big companies
        • Many stakeholders (IT, Analytics, Compliance, Marketing etc.)
        • Much more interested in mitigating risk of a major failure
      • Small companies
        • Depend on data to hit growth rates and investors' expectations so must lean towards more flexible innovative solutions
    • Most large companies earned their reputation and market share in the past when things were simpler, and data was less of a competitive edge
      • But that is increasingly changing
      • Some start-ups are moving from small to very large (e.g. Uber, Spotify etc.) quickly
      • Some large companies will disappear as they lose their market competitive advantage
  • What are the components of a modern data architecture?
    • Three characteristics:
      1. An architecture that allows pushing the data into a relational database
      2. Running collection / measurement on the event level and with data unprocessed
      3. Deploying testing environments to ensure that everything built can be tested and maintained
    • The first one is ubiquitous these days but the other two not so much
    • Not being locked into a specific solution be it GA, Adobe or other
      • There are solutions such as Fivetran that do not lock you into specific structures and allow you to retain the raw data  
    • Stay as unbiased as possible when structuring the measurement schema and only define the data in the data model  
      • As opposed to how data cubes historically work
      • This will give you flexibility further down the line if any definitions must change
    • Potentially companies such as Salesforce offering more sophisticated data solutions
      • Might explain why they bought Tableau
      • But would we ever get to a point where one supplier provides an all-encompassing solution?
        • The group concluded that it is unlikely
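The first two characteristics above (event-level, unprocessed data in a relational database, with definitions applied in the data model rather than at collection time) can be sketched briefly. This is an illustrative toy using SQLite; table and column names are assumptions, not a recommended schema:

```python
# Illustrative sketch: store raw, event-level data in a relational table
# and define business concepts (here, what counts as a "conversion") in
# the query layer, so definitions can change without re-collecting data.
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        event_id   INTEGER PRIMARY KEY,
        user_id    TEXT,
        event_name TEXT,
        ts         TEXT,
        payload    TEXT  -- raw, unprocessed JSON kept as-is
    )""")
conn.execute(
    "INSERT INTO events (user_id, event_name, ts, payload) VALUES (?,?,?,?)",
    ("u1", "add_to_cart", "2020-06-17T10:25:00", json.dumps({"sku": "A1"})),
)
# The definition lives in the query, not in the collection schema:
row = conn.execute(
    "SELECT COUNT(*) FROM events WHERE event_name = 'add_to_cart'"
).fetchone()
```

Keeping the payload raw is what preserves the flexibility mentioned above: if the definition of a conversion changes, only the query changes, as opposed to how pre-aggregated data cubes historically work.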

The customer journey has been a catchphrase in digital marketing and ecommerce for many years. Optimising the path through different customer touch points is the end goal of the effort to understand these journeys. The toolset for analysing customer journeys has always been wedged between overly simplistic views on journeys with one-way funnels and overly complicated methods such as multi-layered Sankey flows where it is extremely difficult for humans to extract actionable insight.

In this session, led by Steffen Methling of OTTO, we will explore what has worked in the past and whether there are any new methods fuelled by advances in machine learning to extract more meaningful information from user behaviour across touch points not easily fitting into classic AIDA funnels.

Come share your customer journey analytics experience and together we will emerge with tangible outcomes you could apply at your organisation following the conference.

Key highlights   

  • Before you start Customer Journey Analytics (CJA), you must have a clear idea of:
    • The analysis purpose, as CJA is rarely straightforward
    • The actions to be taken from your analysis and performance metrics
      • Some CJA use cases discussed included:
        • CJA-driven marketing attribution modelling
        • Micro segments for targeting marketing campaigns
        • Identifying user / customer segments for onsite targeting
  • First steps:
    • Collect the required data on interactions for all touchpoints
    • This is now often solved though not always for multi-touch
    • Several participants indicated that stitching customers / users IDs from various systems is now mostly solved
      • If not then work with what you can achieve (so long as measurement is accurate) rather than wait until you have the capability to track everything
  • GDPR and ePrivacy are a challenge for CJA
    • Stricter limits on using data without clear user consent reduce the amount of data available for analysis
      • Not necessarily a major issue since digital analytics data has always been incomplete (ad block etc.) and should be considered as a survey with a very large participation set
    • Good consent management is key to convincing users to opt in to share their data to make the user experience better for everyone
  • Some best practice ideas
    • Conduct small POCs for journey-based micro segments and campaign targeting, to see what works -> e.g. provide activity ideas to customers based on articles they look at, then scale and automate your logic
    • Use deep learning on unstructured touchpoint data, like sentiment in customer feedback, and combine it with business rules
      • E.g. send a discount coupon to customers that become unhappy with the product
    • Use customer segments like brand loyal customers or more generic buyers and compare their journeys e.g. look at keywords they use in search
    • Use a holdout group in marketing or targeting to measure differences
      • This holdout group should have a clear place in the budget as lost revenue
    • Use journey lookalikes for targeting
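The holdout idea above can be made concrete with a small sketch; all numbers here are illustrative assumptions, not figures from the discussion. The uplift is the treated group's revenue per user minus the holdout's, and the holdout's forgone uplift is booked as a known cost.

```python
# Minimal sketch of measuring a campaign against a holdout group.

def uplift_per_user(treated_revenue, treated_users, holdout_revenue, holdout_users):
    """Incremental revenue per user: treated average minus holdout average."""
    return treated_revenue / treated_users - holdout_revenue / holdout_users

# Treated group saw the campaign; the holdout group deliberately did not.
incremental = uplift_per_user(
    treated_revenue=50_000, treated_users=10_000,  # 5.00 per user
    holdout_revenue=4_000, holdout_users=1_000,    # 4.00 per user
)
print(incremental)  # 1.0 incremental revenue per user

# The holdout's "lost" uplift belongs in the budget as lost revenue:
holdout_cost = incremental * 1_000
print(holdout_cost)  # 1000.0
```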

CJA Canvas – a cheat sheet template
The group put together the key discussion points into a template (referred to as the canvas) for others to use:

  • The Canvas should be the starting point for CJA projects
    • Should be completed in a workshop with the relevant stakeholders
  • The purpose of the Canvas is to lead to a common high-level understanding of the purpose of the project and expected outcomes
    • Consequently, the Canvas should not be overly detailed
    • Something a non-analyst can review and immediately understand
  • The key points in each box should be discussed in small groups and filled in together during this workshop (analysts, data people and stakeholders like users and touchpoint owners)
  • All stakeholders involved in the CJA project should meet regularly to review progress and validate that the Canvas is still relevant
  • Templates
    • See image below
    • Download the populated pdf version here
    • Download the empty pdf version here
Financial Times

Global privacy regulations are evolving with a myriad of different rules in different jurisdictions. Further evolutions of browser privacy settings - first with the rise of ITP and soon with the demise of 3rd party cookies – are fundamentally changing the digital marketing and analytics landscape. Management is looking to us to provide clarity and more importantly – solutions.

Join Tom Betts, Chief Data Officer at the Financial Times to discuss:

  • The current and upcoming changes in regulation
  • Are regulators or big tech making the largest impact?
  • What changes are required both in terms of tech and processes?
  • How you can get ahead by preparing for these changes?

Come away from this discussion with a clear understanding of the regulatory threats and opportunities for your analytics practice and with practical options for the most pressing challenges.

Allison Ma

Data literacy can lead to a conceptual transformation within a business. The challenge is to get various stakeholders to either buy into data literacy or put away past failed attempts and trust you to deliver the desired transformational outcome this time round.

In this discussion, led by Head of Data Science at Wagestream, Allison Ma, we will tackle this challenge head on and look to answer the following questions:

  • What data literacy challenges do you face within your org?
  • How do you convince different teams to accept your help with specific goals?
  • Should our approach differ based on stakeholders’ seniority level?
  • How do we handle (inevitable) push back productively?
  • How do we improve an already data-driven team and help them level up?

During the discussion we will seek to expose as many use cases as possible so you can go back to the office with tangible concepts applicable to your specific circumstances.

Common challenges

  • If already in a data-literate org, how to evolve and bridge the last-mile gap
    • How to educate people without undermining or questioning leadership
    • Helping senior members who may be set in old, non-data-driven ways
    • How to act on real time data

How do you work with teams that are not so data literate?

  • General challenges
    • Not stepping on toes or egos is a big struggle here
    • Stakeholders in non-data driven teams may feel threatened, suddenly too accountable, or like they need to catch up
  • Solutions
    • Must build bridge between their creative thinking and our analytical thinking
    • Be a “teacher not a preacher”
    • A few people had good success sitting down with stakeholders directly
    • Some people have had to be more subtle and gently guide them to a data driven path over time
    • Framing the gap between creative and analytical thinking as a bridge to be built helps

How do you build a scalable solution? Is it top down (C-level then trickles down) or bottom up (stakeholder level then builds up)?

  • Solutions
    • Educating decision makers
      • Decision makers need to know how to ask questions to data people
    • People in org should speak the same data language
      • Bridge gaps among marketing, product, and pricing teams, and especially those who lack experience
    • Be visible
      • Helping the right people
      • Reminding people that the data team exists (more below)
    • Build trust within orgs or between you and stakeholder
      • Ensure data is correct, show confidence in data and understanding
  • Top down approach argument
    • Top management must support data driven culture
    • Need data ambassadors within teams
    • C-level training on data (kept discreet, being careful not to undermine anyone's leadership skills)
    • New starter classes on data culture are helpful, as well as master classes and intro-to-data courses

Now that your organization is data literate, how do you “level up”?

  • Challenges for data literate org
    • Too many requests
    • People are obsessed with new insights and trendy data models and terms
    • Scaling data governance
  • Solutions
    • Put data people in meetings and decision-making processes
    • Label requests as Level 1-3 solutions (3 being more advanced and more time-consuming) - you will find most people want Level 1 solutions
    • Verified dashboards for data governance

Does centralizing or decentralizing data teams help with data literacy?

  • Decentralization: Distribution of company matters
    • International companies may need data members in every country for cultural and data expertise purposes
  • Centralization is helpful for data governance

Creative solutions for improving data literacy

  • Remind stakeholders that you are there
    • Someone set aside time every month to read industry reports and pick out charts to send to specific stakeholders
  • Remind everyone that you are there
    • Someone puts together a "news article" on insights, explainers, and KPIs
    • Someone shared an example of another company doing something similar where data team put together their own magazine with insights and have once a month data meetups
  • Listen to what people want
    • Make people your friends and guide them
    • Do not sound too smart
  • Use infographics when possible
    • Eye catching
    • Easy way to get attention and get your work opened
DHL Group

Growing your role into an analytics service advisor requires a change of behaviour on your part to affect a change in the behaviour of your organisation. How we represent ourselves determines how the rest of the enterprise sees us.

Leading the discussion from the perspective of an internal analytics service advisor, Till Büttner will look to unearth ways analysts can present their occupation so that the rest of the company sees them as specialists and authorities instead of reporting monkeys.

What does an analytical statement of work look like? How does an underfunded analytics team properly prioritise the tasks they can accomplish? What can we do and what should we avoid doing to gain respect?

Our output might include a list of positioning guidelines for analysts or advice on helping one’s company become more data driven.

Key Highlights

  • Set your own career goals:
    • How much are you willing to sacrifice to develop yourself professionally?
    • Is analytics your long-term ambition?
    • What are your roots / professional background?
    • What are your strengths? How can you improve on them?
    • What are your weaknesses? How can you overcome them?
  • Automate tasks which can be automated. Learn to:
    • Identify jobs which software can do for you
    • Use programming languages such as Python, R or SQL to write your own automation software
  • Understand your stakeholder and their requirements; particularly, their day to day pains
  • Educate yourself:
    • To actively listen and mirror what you have heard
    • To adopt your stakeholders' language
    • On "Design Thinking" and how that applies specifically to analytics
  • Present insights without being asked:
    • Be proactive rather than reactive
    • Overcome corporate behaviours to deliver value through analytics (i.e. be pushy; do not worry about traditions – “how things work here” and prove data can help!)  
    • Explore and identify the kind of data that could help stakeholders make better decisions
  • Work effectively
    • Look at methods such as Kanban, Scrum etc. particularly in the context of analytics teams' work processes
    • Prioritize your daily tasks by value
      • First In First Out is rarely the right model
      • Prioritization should not be based solely on monetary value:
        • In the short-term you might want to prioritize work that could help promote analytics in the org
        • Aim to create a transparent prioritization model that stakeholders can see and understand
  • Get help
    • Find a mentor
      • Does not have to be an analytics professional or from your own field
    • Find a business partner to understand the business from your "internal customers"
    • Ask colleagues who have skills which you do not have to help train you
    • Invest in self-education (plenty of free or cheap options these days)
      • Areas to look at:
        • Statistics
        • Analysis methodology
        • Programming (R, Python, SQL)
        • Visualisation
        • Presentation (both visual and oral)
        • Project management
  • Use tactics such as:
    • Create your personal branding
      • Will help both with your own positioning and analytics in the org
    • Tiki-Taka: pass balls between analysts and other business experts to generate insights and become visible
    • "What really is your question?" questions help start the right conversation
      • Often stakeholders do not articulate the right questions
      • You might not know what they are looking for either --> a “guided exploration” is required to reach the necessary questions analytics can help with
  • Look to develop a self-service environment. It serves three purposes:
    • Reduces analysts' time spent on trivial questions
    • Increases stakeholder’s competency and comfort with data
    • Will lead to data maturity --> stakeholders will ask your help on more intelligent and sophisticated data questions
  • Start with people who trust you, to move forward faster
  • Avoid going into technical details
    • Stakeholders do not care and will lose interest working with you
  • Always create context:
    • Easier to follow you with context (“why should I care?”)
    • Many stakeholders just have isolated knowledge
  • Do not forget your business roots:
    • Your business history should serve as a backbone you can lean on
    • Keep your skills fresh – get to the point where you can use them “at night while sleeping”
    • Be humble – value the experience you had of being a reporting monkey and from time to time do some of those jobs
      • Will help you deliver better outcomes for your stakeholders, colleagues and analysts working for you

Be yourself and be patient; this journey takes time, but it is worth the investment.

11:45 Virtual Coffee Break

12:00 Gold Keynote

We have all seen presentations about tech tools change to "powered by ML / AI" in the last few years, with promises of full automation and intelligent decisions made by the underlying "engine".

But let's face it: upon deeper inspection, most of it turned out to be just hot air. In this short presentation we will look into some of the more "creative" usages of not-machine learning in marketing and digital analytics, the prerequisites for actual application, and some examples.

12:10 Round Table Discussions - Round #6

Each delegate selects and participates in one discussion from below

Round #6
Wed., June 17
12:10 - 13:30 CET

Sid Shah
Conde Nast

Businesses are increasing spend on programs to help improve customer retention and loyalty. Data is critical to the success of personalized customer experiences and targeted campaigns. To support such programs and help reduce churn, you must get your analytics strategy right and put the data structures in place.

In this huddle, led by Sid Shah, Head of Analytics & Insight at Conde Nast International, we will discuss:

  • How to leverage existing data to measure customer loyalty and understand their behaviour?
  • Which tools, techniques and analytics strategies should you use to support uplift in customer retention?
  • How customer data can be used to build and optimize great products?

Share and explore different analytics practices that will help us all build or improve our customer retention and loyalty programs.

Challenges faced in measuring loyalty include:

  • Defining when an individual can be considered a customer
  • Calculating the customer lifetime value
  • Measuring and forecasting retention
  • Many businesses currently measure loyalty through behavioural actions and engagement scores; however, more worthwhile but difficult endeavours would be to:
    • Predict future customer loyalty to reduce churn
    • Balance acquisition against retention
    • Allocate resources to the most deserving customer segments
    • In post-COVID times, as offline outdoor activities resume, retaining newly acquired loyal customers online will pose new challenges

Other interesting takeaways were:

  • Keep track of your rising acquisition costs. Acquisition costs should comprise one-third of the customer lifetime revenue
    • Besides defining metrics that relate to the health of your business, you need to know the level of investment required to sustain loyalty and allocate a proportion of your costs accordingly
  • The USP concept is getting weaker and weaker in a crowded marketplace, as businesses realize the importance of ESP, i.e. the emotional aspects of the customer experience and journey
  • NPS measurement can be flawed owing to its single time point approach. However, if used across different stages of the customer journey, to measure experiences (e.g. “do you feel safe in the airport?”, “do you know where you are going?”), NPS values can be meaningful
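For reference, NPS applied per journey stage might look like this minimal sketch; the stages and score values are invented examples. NPS is the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6).

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical scores collected at two journey stages rather than one time point.
stages = {
    "airport security": [9, 10, 8, 6, 9],
    "wayfinding":       [5, 6, 7, 9, 4],
}
results = {stage: nps(scores) for stage, scores in stages.items()}
for stage, score in results.items():
    print(stage, score)  # airport security 40.0, wayfinding -40.0
```

Comparing the per-stage values highlights where in the journey the experience breaks down, which a single overall NPS would hide.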

To achieve your vision of what analytics can do for the organisation you need a team. Your analysts are your front line – their work reflects on you. They are talking with the business, interpreting data, crafting presentations. They need to understand the customer point of view, to develop the consulting skills to understand and serve the business, and the ability to not only interpret the data but put it together into a compelling story. They need the exposure to develop their careers.

That is a tall order for one job description – consultant, researcher, data expert, visualiser, storyteller. For Sandra Kampmann, Head of Analytics & Insight at ASOS, this is a daily challenge managing her team. In this session, she will lead a discussion on how to define development opportunities for analysts, help them hone their skills and keep them motivated.

Join us for a discussion that will offer tangible benefits to both team managers and senior analysts.

Key highlights

  • Importance of setting the right expectations and feeding development progress back
    • Useful tool – competency framework
  • Expectations are across technical skills, drawing the right conclusions from data and soft skills
  • How to ensure the right mix of personalities and cultures in the team
    • Requirements listed in CVs
    • The right job titles to attract talent
    • Tailored hiring process to appeal to right candidates
  • Tools to enable self-development
    • Stand ups
    • Master classes documentation
    • Workshops
    • Analysts involved in data process alongside analysis
    • Encourage analysts to teach each other
  • Coaching and developing managers
    • Expectation setting is very important
    • Mapping out career paths and progression (manager path vs. contributor)
    • Caring about personal aspects and work life balance

One of the most powerful skills an analyst should look to develop is the ability to influence stakeholders. Generating relevant and accurate insight is a must. A simple to understand presentation too. Yet, those on their own are rarely sufficient.

In this discussion, Yael Farkas, Team Lead Digital Intelligence at Parfümerie Douglas, will challenge us to come up with answers to the following questions:

  • How can you activate your analysis (how to make stakeholders act, or at least test your conclusions)?
  • How to create momentum for your analysis?
  • Where should you be explorative and where to be a sparring partner?
  • Why are we still talking about this challenge in 2020? Could it be the lack of boldness and persistence that holds us back?

Being an influencer is a key goal for any analyst. Come share your experiences and together we will outline practical solutions to help you achieve that goal.

We started out with our current attempts to distribute our analysis results and some of the ups and downs of it. Quickly moving to tactics to overcome some of the challenges and get our findings across to create value for the company.

  • If stakeholders do not use our insights; who are our 'competitors'?
    • Marketing Agencies & Cognitive biases
    • Hidden agendas
    • Time of executives
    • Budgets
    • Sometimes a plain "We always did it like this. Why should we change it?"


  • Communication gap: stakeholders see delivered insights as just data, where analysts see insights
    • Solution: Tell them what you see. Even if it is stating the obvious
  • Always have a management summary with max. 4 bullet points, else people will not read it.
  • Storytelling is important - words over numbers
  • Lots of insights sharing seems to be happening via Slides. Here are the top learnings that were mentioned:
    • Use McKinsey titles --> the key statement is always in the title.
      • Also helps to prime the recipient for the perception of the presentation
      • Getting the title right might take a lot of time but is worth it
    • Beware of the font size -> if it is too small and has too much information, no one will read it
      • Especially true for C-Level
    • Think about the order of your slides -> sometimes people will stop after the first one
    • Have a template
  • Challenge: find the right balance between time & focus in important steps (analysis, storytelling, engaging people)
  • 1:1s seem to work better than weekly send-outs, but might be a challenge in larger companies -> divide and conquer: focus on those who are open and willing to use the insights, and trust that others will follow once they see the results
  • Successful approach of one of the participants: Once per month take one day to look at data and share within one meeting
  • Go through small A/B tests and see what the reaction is
  • Use elevator pitch (can be done via mail as well)
    • Think about who you are pitching with and how you need to communicate
  • Think of your insights as a product and behave like a salesperson
    • Do marketing
    • Speak the language of your stakeholder
    • Figure out what key drivers, pain points, etc. are
    • Help them shine
    • Once the relationship is established, do not forget to nurture it

Why are we still talking about this in 2020?

  • Data & tools did evolve but data literacy and storytelling did not
  • A Chief Data Officer, or better a Chief Insights Officer, is missing
  • We need to be influencers for others, esp. upper level management
Bourne Leisure

These days just about anything in digital can be tracked. “Hooray!” shout your stakeholders, not truly understanding both sides of the coin – the blessing and the curse. Positioning this vast array of data points to make sense and be of genuine benefit to stakeholders is a tricky task for any analytics team. Enter KPI frameworks!

Led by Dan Grainger of Bourne Leisure, this discussion will address the following points:

  • What does an effective KPI framework look like?
  • Where to start? Who should be involved? Who owns it?
  • Are your KPIs actionable? Should you follow SMART or other methodology?
  • How do you evolve your KPI framework over time but keep consistency?
  • How do we get our KPI framework adopted across the organisation?

This discussion will give you the impetus to identify, set and use targeted KPIs to achieve better outcomes for your business and customers.

What are KPI frameworks and what are they for?

  • A measurement for improvement
    • KPIs must relate to all the objectives
    • Combination of high level and drill down
  • To make actions comparable
    • Need to focus on three to five
    • Ability to make changes during the life of an activity (e.g. marketing campaign)
  • Include “North Star” metrics…but more is needed
    • Useful, but they can oversimplify matters
    • They give the “what”, but not enough depth for the “how”
  • The framework becomes a segmented “pyramid” of objectives and KPIs
    • Goes hand-in-hand with the creation of a data self-service / self-sufficiency culture

What is the people aspect of KPIs?

  • Universal Segmentation
    • Coming up with common characteristics
    • Work with various stakeholders to create segments and base your KPIs on top of that
    • But runs the risk of creating two-tiers of KPIs
      • As the KPIs try to accommodate all departments and stakeholders they become less relevant to each department and stakeholder
      • So managers have to use the universal KPIs for top level reporting but another set of KPIs for their own department to monitor success
  • Sometimes there are conflicting needs and objectives within the company
    • The bigger the business, the harder it gets to have a proper KPI framework
    • Might suggest a broader problem in the organisation --> why do different departments have potentially conflicting objectives or drive in different directions?
    • Personality plays a major role --> how managers manage their teams/ departments
  • Some stakeholders do not even know the objective of their website (e.g. in the automotive industry some do not know the purpose of the website)
  • Alignment is critical for setting an effective KPI framework
  • Always be considerate of personalities and adjust your stakeholder management style accordingly when creating these frameworks
    • Some stakeholders / leaders are more “closed” than others – they may be insistent on specific metrics that they “have always used”, which doesn’t necessarily mean they’re the right ones

Training and goal setting methodologies prove useful

  • Consider taking product owners (for example) through OKR training so they value the need to relate objectives to results
    • Semi-annual or even quarterly setting of objectives
    • Product owners will then sit with their teams to set the secondary and tertiary objectives/ KPIs
    • The larger the team the harder this objective setting process is
    • Data literacy clearly important and should be part of the education/ training
    • For each objective there may only be three outcomes (key results)
    • Tertiary objectives/ KPIs are named “health check” objectives --> used to monitor trends but not considered an objective that one could potentially influence
    • All this together provides ACCOUNTABILITY
  • Who's the KPIs owner?
    • Not easy to answer --> accountability sits with the product owners, but analysts own them day to day
  • Which methodologies can be used?

How to get stakeholders to adopt your KPI framework?

  • Must align with the business objectives otherwise very unlikely to be able to integrate it effectively
  • Also alignment with stakeholders’ personal objectives
  • If no alignment then very hard to impose + also tough to do when you're an external consultant
  • This is NOT easy
    • Even harder if you are working on the agency / consultancy side where you are an “outsider” to the business!
    • Tip: find someone within your client’s business to become a champion for you internally

Automated Machine Learning (AutoML) enables data engineers with limited ML expertise to train high-quality models specific to their business needs. As such it offers a similar transformation to that experienced through data democratisation – analysts should be able to use ML without data engineering skills. How are you taking advantage of AutoML?

In this discussion, Ian Thomas will lead us through the following questions:

  • What are the use cases for AutoML in analytics?
  • How do analysts build their skills and capabilities with AutoML?
  • Who are the leading AutoML platforms and how to apply them into your day to day work?

Come share your experiences and learn how others are making use of AutoML to progress their analytics output.

How organisations are embracing ML (and AutoML)

  • Most participants' organisations are still at an early stage with ML
  • ML provides a natural bridge for Analysts who are being asked to predict what will happen, rather than simply describe what has happened
  • Analysts can play a key role in bringing the benefits of ML to the wider org - can bridge the gap between stakeholder outcomes and the data and models (McKinsey calls this the "Analytics Translator" role)

Getting started with ML

  • Contrary to popular belief, you don't need a huge amount of data to start with ML, but high-quality data is essential
  • Common "first step" ML use cases are churn prediction, conversion prediction (propensity models), and lifetime value prediction for new customers
  • Anomaly detection is another good initial use case because it focuses the business on quality
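As a rough illustration of the churn-prediction "first step", here is a toy propensity model: logistic regression trained by stochastic gradient descent in plain Python. The features, scaling and labels are all invented for illustration; in practice an AutoML platform or a library such as scikit-learn would handle model fitting.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(X, y, lr=0.1, epochs=2000):
    """Fit logistic regression weights with per-sample gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Invented features per user: [days since last visit, sessions per month],
# both scaled to 0..1; label 1 = churned, 0 = retained.
X = [[0.9, 0.1], [0.8, 0.2], [0.2, 0.9], [0.1, 0.8], [0.7, 0.3], [0.3, 0.7]]
y = [1, 1, 0, 0, 1, 0]

w, b = train(X, y)

# Score a new, low-engagement user: churn risk should come out above 0.5.
risk = sigmoid(sum(wj * xj for wj, xj in zip(w, [0.85, 0.15])) + b)
print(round(risk, 2))
```

The value of starting this small is that the quality of the (high-quality, even if modest) training data dominates the result, which matches the point above.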

ML and the organisation

  • ML needs analytics/data teams and engineering teams to work together, which is not always easy
  • Many data scientists are not very interested in operationalising their ML models - AutoML has a valuable role to play here
  • The ease of use of AutoML tools means that analysts with little formal DS training can create models, but the idea of a "Citizen Data Scientist" (that has no formal analytics training) is a fanciful one
  • Some data scientists are a little threatened by the democratisation of data science, but end up realising that AutoML can help them become more productive and do less boring work

ML risks and ethics

  • There are many examples of ML models gone awry - guardrails are important
  • Transparency and accountability are key - AutoML should not be used to create "black box" models
  • The role of ethics in the deployment of ML is essential
  • Apparently "unbiased" ML models can easily reflect the biases of the people who created them, or the data with which they were trained

Most organizations have a business strategy, a product strategy, a marketing strategy... But do you know one that has a data strategy?

A data strategy's purpose is to set the foundation for effectively leveraging an organization's data and turn it into value. Hence, a good data strategy does not just entail technology and tools, but the whole value chain of data within the context of business objectives and priorities. The tricky part is making this concrete, defining the components, deriving initiatives, use cases, and setting priorities and responsibilities.

In this discussion, moderated by Marc Preusche, marketing data (strategy) nerd at Dept Data & Intelligence (formerly known as LEROI), we will collectively answer the following questions:

  • What are the components of a successful data strategy?
  • What are the key challenges when setting up a data strategy? What are drivers of success?
  • Who should be part of creating the data strategy? What does it mean for the analytics professionals?
  • What are the short-term actions ("quick wins") and what should be tackled more long-term (like data culture)?

Let's make this discussion as concrete as possible to help you make the most of your organisation's data.

Data strategy components (to be successful)

  • Relation to business strategy
    • Data strategy needs to relate to overall business strategy
    • Data strategy needs to help achieve the overall business strategy (e.g. to make the right decision or increase performance/efficiency)
    • Consider the four Ps: People, Productivity, Products, Purpose
  • Long-term plan (objectives)
    • Data Strategy is a strategy itself with a long-term view
    • A business strategy might change over time which should not jeopardise the data strategy
    • Data strategy needs to be resilient
    • Two-fold
      • Long-term strategy (big investment) to make data an integral part of the company
      • Short-term strategy (smaller investment) to serve short/mid-term business goals
  • Goals & Objectives
    • Use Cases: what do I want to do once I have data?
    • ROI Focus: defined goals and use cases, including clear metrics to prove the success of the ambitions
    • Measurability: ensure that goals can be measured (correctly)
  • Organisation (People & Culture)
    • Skills: do we have the right skills in the company? How do we upskill employees?
    • Frameworks: are we dealing with data correctly and effectively? Do we genuinely want to work with data?
    • Ownership: who is responsible for the data? Who delivers what data to whom?
    • Roles:
      • What roles do we need and what skills do they have to have to achieve the strategy?
      • How can these roles be hired or developed?
      • Do we need data stewards?
    • Team setup: Centralisation vs. decentralisation
      • Hub & Spoke vs. central team with clear RACI-model
  • Processes:
    • Coverage: do we have the necessary processes in place to achieve the data strategy (including access, quality, logic…)?
    • Responsibility/Accountability:
      • Who does what?
      • Who is accountable for what?
    • Request Management: define how requests are submitted, prioritised and processed
  • Technology
    • Stack: do we have the right tools?
    • Scalability: can we scale?
    • Integration with IT: have sufficient freedom but also integrate with IT strategy and stack
  • Data Governance:
    • Data Quality/Correctness: is the data still ok? Is it usable?
    • Data Accessibility: who has access to what data and when? → need-to-know principle
    • Data availability:
      • Do we have the right data?
      • What other data do we need?
      • Is the data actionable?
    • Data security:
      • Is the data safe?
      • Is access limited to the people who need it?
    • Data compliance/privacy:
      • Are we compliant with local and international laws?
      • Is what we do or want to do right and allowed?

Steps to get to a data strategy

  • First Step:
    • Start with internal PR: talk to everyone for two months to get a well-rounded view of what is important to stakeholders
    • Interview all core stakeholders to derive objectives, then define and prioritise them
  • Second Step:
    • Get the right people to the table to be able to realise the plan
    • Define the value for everyone involved and ensure it is achievable and measurable

13:35 Sponsor Demo Rooms

13:50 Summary & Prizes

14:00 Happy Hour

14:30 Workshop

Wed, June 16
14:30 - 16:30 CET
IIH Nordic

The nature and application of digital data is changing. Organisations are constantly looking to improve data utilisation. New analytical tools and data flows are emerging. Management's increasing interest in data, and its awareness of data's importance, are in turn changing the purpose of analytics.

What do these changes mean for the data / digital analyst? Are your current analytical skills and methodologies going to serve you well tomorrow? What would you need to deliver to enable organisational success and be seen as a driving force?

Steen Rasmussen, Founder and Director of Analytics at IIH Nordic, will guide you through a framework for your analytics future. In this interactive workshop, he will help define, answer and plan some of the key activities needed to adapt current analytics approaches to the future state of competitiveness on data.

We will look at the increased automation of the analyst's work assignments to free up time for higher-value exploratory tasks, as well as to enable direct data activation and orchestration. Based on the concept of profit-driven analytics, we will explore scenarios for increasing the relevance, significance and impact of digital data on business operations.

Steen will also be building on insights from IIH Nordic’s full digital transformation which, as part of a new vision for analytics work, included moving all staff to a 30-hour, 4-day working week without loss of earnings or productivity.

You will come away from this workshop with newly acquired skills and a clear vision of how to improve your analytical output.

16:30 End of Day & Conference