Virtual DA Hub Copenhagen - Discussion Summaries

First Discussion Round

Andrea Mestriner
YOOX NET A PORTER

A recurrent theme across the analytics community is finding the right mix of people, process and technology. No one structure fits all, yet most organisations struggle to strike a good balance. The challenge is compounded for multi-nationals – how do you balance regional requests for reporting and analysis with a globally-managed platform?

As the Global Head of Analytics for luxury fashion etailer YOOX NET A PORTER, Andrea Mestriner is constantly looking to get that balance right. In this discussion he will explore the following questions with you:

  • How do you identify the problems worth solving?
  • How do you identify the right people, processes and technology mix?
  • How do you engage stakeholders in the analysis process?
  • How do you collaboratively create and share best practices across disparate teams in multiple locations?
  • How much freedom should regional teams have?

You will come away from this discussion with a set of insights and ideas for moving your analytics programme forward.

Steen Rasmussen
IIH Nordic

As the saying goes, the only constant is change. If digital analytics implementation projects were once perceived as ‘one-offs’, these days savvy organisations understand that they cannot afford to stand still with their analytics implementations.

Even the most sophisticated analytics implementation must adjust to changing business requirements, a growing appetite for customer journey data and shifts in technology towards cloud and machine learning.

In this session, Steen Rasmussen of IIH Nordic will explore how to evolve your implementation beyond the standard into the realm of deeper insight and activation. We will discuss how implementations should help drive action and real-time targeting to deliver return.

Topics will include:

  • Measuring price sensitivity through product data and micro conversions
  • Actively using forecasting to enhance the actionability of data
  • Optimizing for customer journeys and CLV through decision engines and cloud
  • Enriching data and ensuring the right variables are captured
  • Integrating cloud and machine learning to enhance Return on Analytics

Bring your advanced analytics implementation examples and hear what other leading organisations are doing to keep their analytics framework fresh.

  • Focus on business value, not data. The key role of analytics is supporting business decisions; if we don’t do that, we are a bad investment
  • Better to have some data supporting just one objective than a lot of data supporting nothing.
  • We have tried for years to gather more data so we could help answer people’s questions, but people do not have good questions – teach them to ask better questions
  • To maximize return on analytics we need to stop talking analytics and start talking business
  • Automate what can be automated

Heidi Teschemacher
Life Extensions Europe

Many organisations, including digital ones, are still heavily focused on their brand and are struggling to transform into truly customer- and data-centric entities. Today’s commercial reality is that unless data is captured, shared and utilised, any increased adoption of digital technology will be in vain. The competitor with the better data will most likely win in the end.

Life Extensions Europe (LEE), the supplements retailer, has made this leap. In this discussion, Heidi Teschemacher, Commercial Director Marketing & eCommerce at LEE, will explore how to turn product into data and data into profit. Questions we will answer include:

  • How do you remove professional silos in teams and nurture hybrids who understand both the brand and the customer?
  • What transformation in KPIs is required when moving from brand to customer centricity?
  • Why do agility and speed to market play a key role in this transformation process?
  • What leadership skills are required to take your team through this process?

You will come away from this session with fresh ideas for improving your analytics output and your value to your organisation, and gain insight into what a customer- and data-centric business looks like.

  • Silos are difficult to break down and to keep down. Therefore, the focus should be on having ONE GOAL
    • Several participants gave practical examples
    • One participant mentioned how, in a growth organisation, they hold monthly meetings on how individuals and teams perform on growth thinking, behaviour, attitude and deliverables towards that ONE GOAL for all
  • The roadmap for transition was discussed
    • The turns in the road symbolise constant change, agility and casualties (saying goodbye to people), as skills, mindset and willingness to transform are important but not always possible to achieve
  • The discussion then turned to brand versus data, with the group sharing a few examples
  • Regarding leadership – a few mentioned the importance of top management engaging with and understanding the change in skills and expertise
  • The discussion concluded with a flurry of detailed questions. There was time to answer some, such as how the data was used
  • The supporting slides used during the discussion were shared with participants for further review

Frédéric Serval
LEGO Group

Data democratisation aims to provide non-analytics stakeholders greater access to the organisation’s data so they can make better and faster decisions. However, data isn’t universally understood – stakeholders must acquire a certain level of data knowledge to interpret it. Interpreted incorrectly, data could lead to suboptimal decisions.

So does data democratisation suit every organisation? In this discussion, Frédéric Serval of LEGO Group will invite delegates to share experiences and opinions on their data democratisation efforts. We will cover the following points:

  • Is data democratisation always a good thing? If so, why is it so important?
  • How do you turn the organisational mindset towards greater data accessibility? Who should have access?
  • How do we get non-analysts comfortable manipulating data? How do we control quality?
  • What makes a successful self-service analytics program?
  • What should your reporting and visualization strategy look like?

This discussion is designed to give you a view of how other businesses are tackling data democratisation and provide some tangible ideas you could apply to your own organisation.

The summary was too long to add here and is available as a Google Doc: https://docs.google.com/document/d/1s7i9npqxu_b5RWb3AO5yCRNaw_gAUurmAby0VnFWAtg/mobilebasic

Martin Madsen
nemlig.com

Global privacy regulations are evolving, and further changes to browser privacy settings – first with the rise of ITP and soon with the demise of 3rd-party cookies – are fundamentally changing the digital marketing and analytics landscape. Management is looking to us to provide clarity and, more importantly, solutions.

And so server-side tracking is back in fashion. But not without its challenges. nemlig.com are in the process of implementing server-side tracking. In this discussion, led by Martin Madsen, Senior Data Engineer, we will look to answer the following questions:

  • How does server-side tracking work in a cookie-less world?
  • What are the key technical challenges and how do we overcome them (e.g. tag management, tracking logins etc.)?
  • Server-side tracking requires a much closer collaboration with IT – what are the organisational challenges we face when commercial and IT teams come together?
  • Which cloud infrastructure should we use, and who should own the platform post implementation?

If you are considering server-side or in the midst of implementing a server-side solution then this discussion is for you. No cookies served!

  • The discussion started with a conversation about cookies, ITP and 1st vs. 3rd party cookies
  • It then naturally progressed into the pros and cons of server vs client-side tracking
    • We concluded that server-side tracking can’t completely take over the current client-side setup – there is a different use case for each method
    • We also concluded that server-side can take over much of the current event tracking setup, but we still need a 1st party cookie to attribute the events to an actual computer or device (see the sketch after this list)
  • The next discussion point was about Google Analytics and Adobe Analytics – their historical and current purpose. We asked whether it is going to be the same in the future
    • We concluded that the purpose is more or less the same and that it will still be tied to digital marketing (paid media) in the future
  • We also touched upon tech and people: who can drive a server-side tracking implementation?
    • We concluded that IT are the most likely team to lead it
    • We noted that data engineers and data scientists in a commercial department could also lead the implementation in collaboration with the IT department
  • We finished the conversation with participants sharing various server-side implementation use cases and best practices
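
To ground the conclusion above about still needing a 1st party cookie, here is a minimal sketch of a server-side collection endpoint that mints and refreshes such a cookie. It assumes a Node/Express stack; the endpoint path, cookie name and event forwarding are hypothetical and are not a description of nemlig.com’s actual implementation.

```typescript
import express from 'express';
import cookieParser from 'cookie-parser';
import { randomUUID } from 'crypto';

const app = express();
app.use(express.json());
app.use(cookieParser());

// Hypothetical collection endpoint the site's own tag posts events to.
app.post('/collect', (req, res) => {
  // Reuse the existing first-party device ID, or mint a new one.
  const deviceId: string = req.cookies['fp_device_id'] ?? randomUUID();

  // Refresh the cookie from the server. Cookies set via HTTP response
  // headers are not subject to ITP's 7-day cap on script-written cookies.
  res.cookie('fp_device_id', deviceId, {
    httpOnly: true,
    secure: true,
    sameSite: 'lax',
    maxAge: 365 * 24 * 60 * 60 * 1000, // one year
  });

  // Attribute the event to the device and forward it onwards
  // (stubbed here; in practice this would go to a queue or warehouse).
  const event = { ...req.body, deviceId, receivedAt: new Date().toISOString() };
  console.log('received event', event);

  res.status(204).end();
});

app.listen(8080);
```

The client-side tag then only needs to post events to the endpoint; attribution to a device is handled by the server-set cookie.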

Marina Medved
Saxo Bank

While analysts once operated in isolation as part of a marketing or product team, in progressive environments they now work more closely with data scientists and data engineers, who have entered the scene along with large and diverse data sets. This is undoubtedly a welcome development for analysts, but it also presents some challenges.

How does this shift change the way that analysts frame their skills and deliverables? How do different data specialists collaborate? What are the career pathways from one type of role to another?

Marina Medved is the Head of Martech & Analytics at Saxo Bank. Among her responsibilities, she is building a team of analysts who are working alongside data scientists and data engineers to deliver a well-rounded set of analytical products and services to business partners throughout the company.

Attend this session if you see your role as analyst changing and are looking to understand what this change means to you.

During the introductions it became apparent that most participants are curious to know "what's next" for digital analytics and which skillsets will be in demand in the coming years. Another area of interest was how to pull together a team of the required specialists without blowing the budget.

The conversation was dynamic and evolved beyond the proposed topics. A few very interesting (and some disruptive) concepts were put on the table for the discussion. Among them: 

  • Does the role of a digital analyst imply analysis, or is it in essence the use of reporting interfaces?
  • Will digital analytics continue to exist as we know it, or will it likely cease to exist as a function in a few years’ time?
  • What skills should one acquire to remain competitive in the job market? Will hard or soft skills be in the highest demand? Which path is best to pursue – specialist or generalist?
  • Will businesses hire their own specialist teams, or will those roles be outsourced?

Opinions were divided, and the participants put forward various arguments that allowed the group to view the situation from different angles.

The majority, however, agreed that:

  • The ability to hear the needs of other teams and translate and narrate the context in the right jargon to various business and technical stakeholders is the core value of the analyst role (the analytics translator)
  • In the near future, soft and generalist skills will be more important and harder to come by, as they can be honed only through practice
  • To function successfully, it is best to focus on opportunities rather than defects; when communicating the value of your work, emphasise what the business could have rather than what it currently lacks

Second Discussion Round

Dan Grainger
Bourne Leisure

Jim Henson’s famous blue monster achieved his iconic global status through a voracious appetite for cookies; there was just never enough of the gooey choc-chip goodness to satisfy the creature. Following the introduction of GDPR, we too find ourselves facing another cookie monster – the consent management platform!

Whether OneTrust, Commanders Act, Quantcast, Cookiebot or another tool, these platforms must be set up in a way that maintains customers’ privacy without destroying your capability to track and market to them within the bounds of the law.

Bourne Leisure are going through this process now. Dan Grainger, Analytics Manager at Bourne Leisure, will lead you through the ins and outs of the process and the considerations that need to be accounted for, including:

  • What should you consider prior to implementation?
  • Who is responsible for implementation and who should be involved?
  • Where should consent management live post implementation?
  • How do you socialise the (potential) business impact?
  • What are the “gotchas” and grey areas?

Om-nom-nom indeed. If consent management is in your remit then join Dan for a stimulating discussion about how to do it right.

Note: we will not cover vendor selection in this discussion as the focus will mainly be on the management and data aspects of the challenge.

  • The session was about the process rather than a vendor. Caveat – we’re not legal experts, any advice/ experiences discussed should be checked with your own business’ legal team!
  • Planning – Up to 50% of the project, absolutely critical
  • Who’s involved? Stakeholders…
    • Directors
    • Compliance/ legal
    • Development/ IT
    • Marketing
    • UX/ design
  • Opting in and opting out should be given equal prominence
  • One participant commented that the choice is between 100% business or 100% compliance
  • Realistically we have to find the right balance within the bounds of the rules
  • Ownership is not always clear as there are many stakeholders. The final say should rest with one person/department – the marketing director, compliance etc. – with advice/reasoning from others
  • How high up should the decision be made? That depends on the implications of non-compliance:
    • The possible outcome of non-compliance with consent rules (how big the fine is, the brand image damage, the sales drop due to a controversy etc.) dictates who has the final say on decision (finance, marketing, legal etc.) and at what level of the organisation (CEO, Head of, etc.)
  • Compliance is driven by public opinion and by partnerships with private platforms (Facebook, Google etc.)
  • Brand perception damage is a penalty in its own right
  • The law is designed to protect EU citizens, so external entities that interact with the EU can also be fined
  • Are companies actually complying?
    • Increasing but few are completely compliant to the letter of the law
    • Could change if lawmakers decide to make an example of someone!
  • Many providers run a scrape of the site to classify cookies
  • “Strictly Necessary”…if you don’t use the cookie something will break!
  • There are existing models, but classifications can be discussed with stakeholders
  • Grey areas
    • A personalisation tool was classified as strictly necessary as it was deemed essential to the user experience
    • Tracking from partners (e.g. the AWIN network) – without cookies the customer discount promise breaks, as you are unable to validate sales or pay commissions
      • *Could* classify as strictly necessary but this will obviously need to go through your compliance team
      • Could work with partners to get them to explain to end users they need to accept consent – though this is *technically* not to the letter of the law currently (again, discuss with compliance!) as you are not allowed to withhold capability
  • The TCF standard (TCF 2.0) was raised, though not everyone in the group was familiar with it
  • Build & QA: 40-45% of the project
  • Deployment: the remainder of the project!
  • Tips/ Gotchas!
    • Take the opportunity to clean your tag manager containers first
    • Julius Fedorovicius has written a great article on implementation (https://www.analyticsmania.com/post/gdpr-cookie-consent-notification-with-google-tag-manager/)
    • Watch out for any hardcoded tags on sites – they should be in your tag manager when deploying a cookie management platform via said tag manager
    • Watch out for dependent tags (i.e. those that require another tag to fire first); both tags will need to be in the same classification or you risk the second tag failing
    • Campaign tracking: your solution should account for the fact that campaign parameters are generally only available on the landing page. One option is to ensure the user has to make their cookie choices before they can engage with the site or move to the next page; an article on Simo Ahava’s blog describes another option (https://www.simoahava.com/analytics/persist-campaign-data-landing-page/). A sketch of one approach follows this list
    • When the tool launches, you’re likely to see a fall in reported volumes (in GA, Adobe etc.). Consider putting back-end transactions (for whatever your “transaction” is) on your dashboards next to your digital data transactions
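
On the campaign tracking gotcha above, here is a minimal sketch of persisting landing-page campaign parameters until the visitor has made their cookie choice. It is browser TypeScript; the storage key and function names are hypothetical, and whether pre-consent use of sessionStorage is acceptable for your site should be confirmed with your compliance team.

```typescript
// Campaign parameters to capture from the landing page URL.
const CAMPAIGN_PARAMS = ['utm_source', 'utm_medium', 'utm_campaign', 'utm_term', 'utm_content'];
const STORAGE_KEY = 'pending_campaign_params'; // hypothetical key

// Run on page load, before the consent banner is answered:
// stash any campaign parameters so they survive navigation.
export function stashCampaignParams(): void {
  const params = new URLSearchParams(window.location.search);
  const found: Record<string, string> = {};
  for (const name of CAMPAIGN_PARAMS) {
    const value = params.get(name);
    if (value) found[name] = value;
  }
  if (Object.keys(found).length > 0) {
    sessionStorage.setItem(STORAGE_KEY, JSON.stringify(found));
  }
}

// Run once consent is granted: retrieve the stored parameters so they
// can be attached to the first analytics hit, then clear them.
export function replayCampaignParams(): Record<string, string> | null {
  const raw = sessionStorage.getItem(STORAGE_KEY);
  if (!raw) return null;
  sessionStorage.removeItem(STORAGE_KEY);
  return JSON.parse(raw) as Record<string, string>;
}
```

In this sketch, stashCampaignParams() would fire on every page load and replayCampaignParams() from the consent tool’s “consent granted” callback.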

Sigi Bessesen
HSBC Bank

We strive to build a strong analytics/optimisation capability to support our stakeholders and impact business change. In most cases the desire is to build an in-house team which would give us more control at a more affordable cost.

However, challenges lie ahead. How do we get the right people? What is the right balance? Can we get management buy-in to approve budget? How do we ramp up our programme quickly enough to show return on investment? These are testing challenges, some of which could be eased or relieved by bringing in subject matter experts to augment our in-house competencies.

Sigi Bessesen has faced these questions many times. He will ask participants how to determine which functions should be staffed in-house and which are best outsourced. We will focus on tips and insights from managers who have tackled similar challenges in staffing and building out teams.

Models

We discussed the pros and cons of ‘in-house’ vs ‘outsourcing’.
The benefits of in-house centred on the knowledge you build up gradually (e.g. better company knowledge of people, tools and market), whilst outsourcing often brings skills and resources not available within the company (e.g. specialist skills, fresher minds, a broader perspective). Another point raised was that (sometimes frustratingly so) outsourced resources automatically get an ‘expert’ label not easily obtained if you are in-house.

Challenges

There is a continuity challenge in onboarding outsourced resources, and outsourced work often contributes from within a silo. One approach suggested was to create a mission statement to bring clarity around purpose, objectives and stakeholders. One benefit (tied to the above point on experts) is that an outsourced person or entity can afford (and might even be expected) to be more provocative than an in-house one.

Takeaway

There seemed to be broad agreement that, rather than choosing between in-house and outsourcing, a hybrid solution is preferable – with a larger focus on outsourcing contributing to in-house team growth and development.

Christina Elmark
EPOS

Leading organisations are now using customer journey analytics to deliver improvements in almost every business parameter: increases in customer acquisition and retention, growth in revenue, better customer experience and improved marketing ROI.

In this discussion, led by Christina Elmark of EPOS, we will draw on the group’s collective experience and explore use cases of using analytics to identify high impact journeys, path to purchase, and uncovering purchase intent early.

We will also ask (and answer) the following questions:

  • How does customer journey analytics differ from the analytics we conducted previously? Is it perhaps just a catchphrase?
  • What tools and techniques are most used for customer journey analytics?
  • Do our analytics teams require reskilling to make customer journey analytics work for our organisation?

Come share your customer journey analytics experience so that we emerge from this discussion with tangible outcomes you can apply at your own organisation.

The topic was picked because, irrespective of company size, we all need customers to come in, stay, tell others about us and be happy with the service or product we provide. We should always analyse and optimise our customers’ journeys.

Key drivers for customer journey analysis were:

  • Increasing operational efficiency
  • Enabling customers to self-serve and self-help
  • Growing revenue through cross and up-sell
  • Driving customer loyalty, value and acquisition

The group agreed that the quality and frequency of customer journey analysis must increase and improve. No one is doing it phenomenally well; organisations and employees find customer journeys difficult to structure and analyse.

The topic is large and, with the benefit of hindsight, we might have been better off focusing on elements of it rather than trying to tackle it all. However, the very concrete examples shared on tracking customers from offline campaigns to online traffic were very hands-on – though measuring this effectively remains a struggle, whether B2C or B2B.

Most participants were using Google Analytics (both the free version and GA360) as their main analytics tool. Very few were using other tools, such as product analytics tools.

Thomas Jacobsen
UNHCR

Building and maintaining trust in digital analytics data has always been critically important. The challenge is becoming increasingly complex in an evolving technology landscape. User behaviour data sets are expanding beyond simple web traffic to include a diverse array of customer interactions and attributes. This variety in data types places new emphasis on effective and accessible data management and storage.

Thomas Jacobsen of UNHCR, the UN’s refugee agency, is responsible for the agency’s digital data, including data management, governance and quality control, as well as developing the tools necessary to harvest UNHCR’s rich data sets.

In this discussion, Thomas will ask participants to share their experiences building and utilising data lakes. We will look at:

  • What are the most common and successful data lake use cases
  • The minimum criteria required to build a data lake (people, processes, software, budget etc.)
  • Managing data quality in a world of multiple data collection touchpoints

If you strive to build trust in an elaborate data environment – this session is for you.

  • We started off the discussion talking about the necessary fundamentals for a good data lake implementation
  • Organisations must get to grips with their data model first otherwise a data lake is bound to become a data swamp:
    • Don't: Organise towards the data sources
    • Do: Organise towards the business needs and reporting needs
    • Suggested creating a folder structure
    • Practitioners must distinguish between data storage and data consumption (two completely different things), e.g. the distinction between human and algorithmic users. You could visualise this as data ponds with cascading waterfalls from one step to the next:
      • Consolidated / governed – for human consumption (e.g. analysis)
      • Curated – clean data mainly for machine consumption (e.g. automation)
      • Raw data – just how it comes in (for governance and QA)
    • Most data lakes are built in a file/folder format; therefore, if you need to do a lot of read/write it will slow the process down
  • Most managers have a great understanding of what data they have in the lake but their documentation is poor. Consequently, no one else understands the data and how to use it:
    • People do not spend enough time on the accompanying data dictionary
    • Documentation is not the most exciting part of the project but is critical
  • Organisations often flood the data lake; instead, they should go step by step
    • Start with fewer tables so data consumers can get to know the data
  • Where/How to create the data dictionary?
    • A wiki could do, but not Excel
    • JIRA
    • Need a folder structure and know how to navigate that structure
  • The more successful implementations carry metadata into the tables in the data lake, i.e. category headers alongside the data rows:
    • Data cataloguing tools are not doing it well yet
    • Open source – Apache project
    • For governance – Apache Ranger
    • Systems mentioned for data management: Databricks (doing interesting things in this area) and Informatica
    • One participant had success with Confluence, though it is not a good long-term solution
  • GDPR challenges
    • As some use cases need data subjects’ consent, you need to control how your data lake users can use the data – they should only be allowed to see and use the data they are permitted to
    • Big challenge around the ‘right to be forgotten'
      • Ensure your data model supports right to be forgotten
        • Normally dealt with through Machine Learning systems
      • Data removal is a big issue – it needs to be orchestrated over multiple systems and data tools
      • Right to be forgotten is the reason why one company represented in the discussion didn’t implement a data lake – they store everything in BigQuery instead, where data can be deleted at row level (see the sketch after this list)
      • Removing all data would be the cleanest way to deal with this challenge, however, if the data is large then you might just want to remove the identifiable data
  • Use cases
    • Great for IoT data streaming (big volumes, real/near-real-time data collection). You cannot shove such data straight at end users
    • BI reporting (Lakehouse – a data lake / data warehouse hybrid) – process and ship what is relevant to your warehouse
      • Interface is still to be defined
      • The lake holds the data for modelling, while smaller data sets can use BigQuery (meaning BigQuery suits a Lakehouse-style framework)
    • Price comparison sites need to get their results into the top ten entries, otherwise they are not seen by customers. So they need a lot of data, which is best held in a data lake
      • Good for training models but not great for sharing
      • For smaller data sets BigQuery type solutions are better
  • Some general comments / advice
    • A data lake is structurally different to a data warehouse and should not be treated the same way
    • Don’t model your lake on tech – it is bound to fail
    • Ensure the data in the lake doesn’t change unexpectedly
    • One company represented in the discussion created data streams building on the raw data. Keep these streams so you can always backfill/replay the data for reporting and analysis:
      • Schema on read (data lake) is hard; schema on write (data warehouse) is better
      • Can be automated
      • Keep a description of each field, including access rights
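
To make the BigQuery row-level deletion point above concrete, here is a minimal sketch of a ‘right to be forgotten’ job, assuming the Node.js @google-cloud/bigquery client; the dataset, table and column names are hypothetical.

```typescript
import { BigQuery } from '@google-cloud/bigquery';

const bq = new BigQuery();

// Hypothetical event tables keyed by a user identifier.
const TABLES = ['analytics.web_events', 'analytics.crm_contacts'];

// Delete every row tied to one data subject across the listed tables.
// Note: rows still in BigQuery's streaming buffer cannot be deleted
// until they have settled into storage.
async function forgetUser(userId: string): Promise<void> {
  for (const table of TABLES) {
    await bq.query({
      query: `DELETE FROM \`${table}\` WHERE user_id = @userId`,
      params: { userId },
    });
  }
}

forgetUser('example-user-id').catch(console.error);
```

Clustering or partitioning the tables on the user identifier would keep such deletes from scanning the whole table, though that is a design choice beyond this sketch.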

Steen Rasmussen
IIH Nordic

As a discipline, User Experience (UX) is focused on understanding your customers’ needs and delivering the easiest possible series of actions to complete a task. Traditionally, UX was design-led. Increasingly, though, analytics is playing a bigger part. From segmentation and Voice of Customer research to social media and journey mapping, data provides a scalable view into customer decision-making and choice.

With the ongoing shift from User to Customer Experience, the need to understand not just what users do, but what they prefer and intend, and how they feel about it as customers, has been pushed to the top of our agenda.

In this discussion, led by Steen Rasmussen of IIH Nordic, we will consider different techniques for understanding customers and their decision-making, investigate when various methods are most appropriate, and share techniques that work well for each method – including how data can be used actively to optimise the customer journey.

We will share experiences, good and bad, of using diverse data sets to augment traditional digital analytics data.

  • At the end of the day the focus needs to be on what drives profit, not just on measuring customers. There is too much data and too little time.
  • To be consistent over time start by narrowly defining what a customer is and then focus on measuring just that
  • There is equal, if not more, value in exploring and measuring the negative experience as the good experience.
  • Remember that often key components of the experience will be lacking. There is no such thing as an amazing restaurant booking for an awful meal; the meal defines the experience, not the booking.
  • With experiences it is often better to be fast than to be precise. You don’t want to know exactly how bad a customer’s experience was; you want to respond and improve.
  • Segmentation is required to understand any journey.

Our next Virtual DA Hub is on April 24th, from 9am to 12:15pm CET

LinkedIn Contacts - Virtual DA Hub April 3rd

Alban Gerome https://www.linkedin.com/in/alban-g%C3%A9r%C3%B4me/
Andrea Juric https://www.linkedin.com/in/andrea-juric-6920615b
Casper Heiselberg https://www.linkedin.com/in/casperheiselberg/
Christina Elmark https://www.linkedin.com/in/christinaelmark/
Dael S. Williamson https://www.linkedin.com/in/daelwilliamson/
Dan Grainger https://www.linkedin.com/in/dangrainger
Fredrik Wahlqvist https://www.linkedin.com/in/fredrikwahlqvist/
Gustavo Cornejo https://www.linkedin.com/in/gustavocornejo/
Harendra Sahu https://in.linkedin.com/in/harendrasahu
Heidi Teschemacher https://www.linkedin.com/in/heidi-teschemacher/
Hugh Gage https://www.linkedin.com/in/hughgage/
Ilona Beliatskaya https://www.linkedin.com/in/ilonabeliatskaya/
Jakob Faarvang https://www.linkedin.com/in/jakobfaarvang/
James Sandoval https://www.linkedin.com/in/jamessandoval
Juan de Oliveira Rubinos https://www.linkedin.com/in/juandeoliveirarubinos/
Lukas Mehnert https://www.linkedin.com/in/lukasmehnert0ppc0sem0seo/
Marco Favero https://www.linkedin.com/in/marcofavero/
Marian Eerens https://www.linkedin.com/in/marian-eerens-6259427/
Marina Medved https://au.linkedin.com/in/medved
Markus Hradsky https://www.linkedin.com/in/markus-hradsky-85ba30ba/
Nadzeya Kalbaska https://ch.linkedin.com/in/nadzeyakalbaska
Nish Wickramasinghe https://www.linkedin.com/in/nish-wickramasinghe
Oliver Schiffers https://www.linkedin.com/in/oschiffers/
Robert Sahlin https://www.linkedin.com/in/robertsahlin/
Thomas Biber https://www.linkedin.com/in/bieberthomas/
Yves-Marie Lemaitre https://www.linkedin.com/in/ymlemaitre/