Data has repeatedly been called the oil of the 21st century, and rightly so. With so many advancements in computing, storage, networking, and analytics tools, the amount of data that gets generated has grown exponentially. The best part of data is that it compounds, largely thanks to the data network effect.
Organizations that are able to harness this data network effect are the ones best positioned to stay relevant. Despite this widely known phenomenon, many organizations are lagging in the race (analytical laggards) or are struggling to get this data network flywheel moving at the speed they would like. This is evident from a study conducted by MIT Sloan Management Review, which found that 67% of surveyed leaders (senior managers or higher) said they are not comfortable accessing or using data from their tools and resources. While this is the result of many factors, the blanket term I use to describe all of them is the absence of a ‘Data Culture’, or a ‘data-fluent culture’.
Enabling the data network flywheel requires the alignment of three elements: strategic imperatives, organizational capabilities, and data culture.
So what is this data-fluent culture, what are its benefits, and how do you evaluate whether your organization has the right data culture to enable your strategic imperatives?
Data-fluent culture: the set of values, principles, and methods in an organization that drives, enables, and requires all decision-makers to base their strategic and tactical decisions on data rather than intuition or historical experience.
Among many other benefits, a data-fluent culture enables real-time, data-informed decision-making that is free of bias and gut feel; fosters a collaborative team culture and a level playing field where even an intern can respectfully challenge a senior leader; brings the organization closer to its customers by highlighting who the right customers are and what they need, improving satisfaction and retention; and ultimately lifts both the top and bottom lines.
So let us look at a checklist that summarizes the ten signs I use to evaluate an organization's progress along the data culture curve. I regularly evaluate my firm using a weighted average along these dimensions so that I know what needs to be improved (a simple sketch of that scoring appears after the list):
(1) Do leaders frequently use data (quantitative and qualitative) instead of intuition to guide their day-to-day judgment: Setting the right culture, in this case a data-fluent culture, requires top-down communication, both direct and indirect, that starts with leaders adopting the mindset. If an executive meeting is mostly free of data and driven by opinion, it is difficult to nurture a data culture in the rest of the organization. So the first thing I would ensure is that the leaders of the organization understand the importance of improving data quality and of grounding decisions in the data that is available. Data here need not be only quantitative, such as metrics; it can also be qualitative, such as user feedback. A good blend of the two is likely to yield a favorable outcome in the long term.
(2) Do your OKRs or quarterly goals clearly define metrics that can be measured, and does the organization meet every quarter to assess progress against those metrics quantitatively: This is related to the point above but worth calling out separately to highlight the importance of setting goals that can be measured weekly, monthly, or quarterly and that leave no ambiguity about whether you are moving in the right direction; a short sketch of such a measurable key result follows below. This should extend into performance reviews as well, so that individual incentives reflect quantitative measures. Too often, managers rely on qualitative measures in performance reviews that leave a lot of room for ambiguity and bias.
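To make the contrast with vague goals concrete, a measurable key result can be expressed as a tiny data structure and checked mechanically at every review. This is only a minimal sketch; the metric name, baseline, target, and current value are hypothetical placeholders.

```python
from dataclasses import dataclass

@dataclass
class KeyResult:
    """A key result tied to a metric that can be checked every week, month, or quarter."""
    metric: str      # the measurable metric, e.g. "weekly active users" (hypothetical)
    baseline: float  # value at the start of the quarter
    target: float    # value committed to by the end of the quarter
    current: float   # latest observed value

    def progress(self) -> float:
        """Fraction of the gap between baseline and target that has been closed."""
        return (self.current - self.baseline) / (self.target - self.baseline)

# Hypothetical example: a key result reviewed at the quarterly check-in.
kr = KeyResult(metric="weekly active users", baseline=12_000, target=18_000, current=15_000)
print(f"{kr.metric}: {kr.progress():.0%} of the way to target")  # -> 50% of the way to target
```

The point is that progress is a number anyone can compute and challenge, which is exactly what a qualitative review comment does not allow.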
(3) Democratization of data: How easy is it to ask a question and get an analytics dashboard or metric that answers it? Does it take an executive to get a response from the team responsible for BI, or can anyone either self-serve or get help when needed? Often, as organizations develop, the analytical asks of executives are satisfied first. There is nothing wrong with that, but if access stays limited to executives, other members of the team will eventually stop asking questions, and at the end of the day the bottleneck will be the decisions of the team actually working on the problem. In many organizations, only a few functions have access to data, typically the data science team, leadership, or the analytics function, and not everyone can poke around metrics to generate insights. When this happens, one part of the organization pushes for data-driven decisions while the rest relies on gut, intuition, or past experience, leading to silos, division, and conflict. If you do not see links to dashboards passed around on the company Slack every day by all teams, you have not yet achieved the data maturity needed.
(4) Do you frequently run into the HiPPO problem (the highest-paid person's opinion), or can even an intern challenge an argument with data that goes against the organization's preconceived notions:
The democratization of data creates a level playing field in the organization where fresh graduates and senior leadership can exchange ideas freely and make decisions based on data. The advantage is a clear demarcation between management rights (given to subject matter experts) and approval rights (to approve or deny a project): subject matter experts have autonomy over how to row the ship, while senior leaders hold the helm and steer it in the right direction. Enabling this is extremely difficult, and almost impossible, without a strong data culture.
(5) Does the product team build a dashboard to measure the outcome before shipping features, and does the team relentlessly phrase hypotheses and collaboratively define experiments to validate them: When data quality is poor, the product team resorts to shipping features without looking at the impact on the metric. Soon the team is busy shipping feature after feature without bothering to quantify the impact. If this is happening in your organization, insist that the team spend 25% of its time on measurement; it is always a collective responsibility. Data quality is a journey, so start on it at the earliest; it will pay outsized dividends sooner than you expect.
In my previous article here, I wrote about how hypotheses can be used to phrase ideas. When this becomes part of the organization's culture, it enables an experiment-driven environment that defuses conflict and triggers the flywheel of the data network effect. Encourage your team to state their hypotheses. A case in point was a situation at my previous company where two leaders argued over whether we should have chat on the home page. One lead strongly believed that chat adds friction, while the other felt it was a strong touchpoint that would lead to higher conversion, except that neither had data to prove it one way or the other. The team felt caught between the two opinions: should they add chat or not? While the two held diametrically opposite positions on the placement of the chat widget, when we phrased the problem as a hypothesis it turned out they were describing the same experiment, one stating the null hypothesis and the other the alternate hypothesis.
I have witnessed meeting after meeting where leads debate what the right user experience should be and never ship anything in the absence of alignment. Phrasing the question as a hypothesis helped reconcile the difference of opinion, triggered the idea of validating it, and we were running a test within the next two hours. The result of the test matters far less than the path that got us to run it.
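To make this concrete, the chat-widget disagreement can be expressed as a simple two-proportion test: the null hypothesis says the widget does not change conversion, the alternate says it does, and the experiment decides. The sketch below is a minimal, self-contained illustration; the choice of a two-proportion z-test, the conversion counts, and the sample sizes are assumptions for the example, not the actual experiment we ran.

```python
import math

def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Two-sided z-test for the difference between two conversion rates.

    H0 (null): the chat widget does not change conversion (p_a == p_b).
    H1 (alternate): the chat widget changes conversion (p_a != p_b).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))  # 2 * P(Z > |z|)
    return z, p_value

# Hypothetical traffic split: variant A shows the chat widget, variant B hides it.
z, p = two_proportion_ztest(conv_a=230, n_a=5000, conv_b=195, n_b=5000)
print(f"z = {z:.2f}, p-value = {p:.3f}")
if p < 0.05:
    print("Reject the null hypothesis: the chat widget changes conversion.")
else:
    print("Fail to reject the null hypothesis: no detectable effect on conversion.")
```

Either outcome moves the conversation forward: instead of two opinions, the team has an estimate, a confidence statement, and a natural follow-up hypothesis.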
(6) Do team members report failures and metrics going south as enthusiastically and objectively as positive metrics, or do they only report the positives: I used to lead a metrics meeting in an organization where leads would drum up the metrics that looked green and downplay the metrics that were red, attributing them to seasonal effects, problems with the data, and so on. My product mentor used to call these vanity meetings: meetings you go to in order to self-congratulate, not to introspect. Week after week we would look at the data, and each new low would become the new norm. This happened not because leads did not want to report negative news but because their incentives were aligned only with the positives. It is the classic mistake of hoping for A while rewarding B: executives want people to report areas of improvement but fail to create a safe environment where teams can do so. Whether this happens intentionally or unintentionally, the sign should be detected and rooted out at the earliest to create a data-informed culture.
(7) Is data science viewed as a team working on its own objectives or as an enabler of every other team in the organization: Data science and data analytics teams should be enablers of every function in the organization: marketing, sales, product, and so on. If the data science team is working in silos on its own set of objectives, re-evaluate the function and role of data science in the organization. The Business Insights and Data Science team that I lead at my company works both as an enabler, augmenting the decision-making of other functions, and as a check, objectively evaluating the metrics of other functions to continuously remind those teams of possible opportunities.
(8) Fosters a fail-fast culture: When I first joined the growth team as a product manager, my first few tests resulted in a huge jump in metrics, but the dozen tests that followed failed. When a test failed, I would report that it failed and frantically look for another hypothesis to test. That is when my VP of product asked me, “What do you mean by test failed? What is your hypothesis on why we failed to reject the hypothesis?” Phrased like that, her question helped me understand the true definition of failure: not that the test failed to achieve the desired outcome, but that the team failed to set the test up in a way that generated insights, quantitative and qualitative, from which we could build a follow-up hypothesis. Once the team internalized how our VP of product viewed tests that did not yield results, we increased our testing velocity more than 5x, continuously running hypothesis tests, accepting or rejecting hypotheses, and using the insights to guide the next set of tests. This enabled an MVT (Minimum Viable Test) methodology to fuel growth. In a span of five months, with only two engineers, one designer, and analytics support, the team architected a $5M incremental revenue opportunity.
(9) Does your product and data science team focus on iteration rather than the MVP: The minimum viable product is often used as a means to get in front of users quickly and collect feedback, so much so that shipping an MVP is celebrated with such gusto that teams become busy ideating and finding the next mountain to scale. This works well when you have two teams, one responsible for identifying the right mountain to scale and the other for scaling it. But if a single team is responsible for both, identifying mountains and immediately moving on to the next one leads to multiple sub-optimal local maxima that destroy more value than they create. A mature data science and product team guided by data focuses on outcomes and iteration.
(10) Data as a culture, not as a slide deck: If your business intelligence team is continuously generating insights only to find no sponsors or champions for those ideas, it is a sign of organizational inefficiency that deserves a closer look. Understanding how often insights get acted upon matters far more than continuously generating insights that no one is accountable for executing. In summary, it is not the slide deck, or the insights on it, that creates value but the execution of those insights. Identify the right sponsor to validate the insights, identify an independent stakeholder to oversee their execution, and create a tighter loop between the team generating the insights and the team accountable for executing them.
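As mentioned before the checklist, I score my own organization with a weighted average across these ten dimensions. Below is a minimal sketch of such a scorecard; the weights and scores are hypothetical placeholders and should be tuned to your own priorities and honest self-assessment.

```python
# Hypothetical scorecard for the ten signs above: each sign gets a 0-5 score
# and a weight reflecting how much it matters to the organization right now.
signs = {
    "Leaders use data over intuition":           {"weight": 0.15, "score": 4},
    "Measurable OKRs reviewed quarterly":        {"weight": 0.10, "score": 3},
    "Democratization of data":                   {"weight": 0.15, "score": 2},
    "No HiPPO decisions":                        {"weight": 0.10, "score": 3},
    "Dashboards and hypotheses before shipping": {"weight": 0.10, "score": 2},
    "Negative metrics reported openly":          {"weight": 0.10, "score": 3},
    "Data science as an enabler":                {"weight": 0.10, "score": 4},
    "Fail-fast experimentation":                 {"weight": 0.05, "score": 2},
    "Iteration over MVP":                        {"weight": 0.05, "score": 3},
    "Insights acted upon, not decked":           {"weight": 0.10, "score": 2},
}

# Weights should sum to 1 so the overall score stays on the same 0-5 scale.
assert abs(sum(s["weight"] for s in signs.values()) - 1.0) < 1e-9

overall = sum(s["weight"] * s["score"] for s in signs.values())
print(f"Data-culture score: {overall:.2f} / 5")

# The lowest weighted contributions are candidates for where to invest next.
weakest = sorted(signs.items(), key=lambda kv: kv[1]["weight"] * kv[1]["score"])[:3]
for name, s in weakest:
    print(f"Improve: {name} (score {s['score']}/5)")
```

The value of the exercise is less the single number and more the ranked list of weak dimensions it produces, which tells you where to focus in the next quarter.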
That sums up my list of the top ten signs I use to evaluate my team and organization. If there are other signs you look at to gauge the data maturity of your own organization, let me know by connecting with me on LinkedIn. Read our other articles on monetization strategy and AI/ML here.
References:
[1] Thomas H. Davenport, Nitin Mittal, and Irfan Saif (2020). What Separates Analytical Leaders From Laggards? MIT Sloan Management Review.
https://sloanreview.mit.edu/article/what-separates-analytical-leaders-from-laggards/
[2] Zach Gemignani (2020). Why Build a Data-Savvy Culture? Juice Analytics Blog.
[3] Wikipedia contributors (2021). Data culture. In Wikipedia, The Free Encyclopedia.