Data will not tell you what to do

mikkeldengsoe.substack.com

184 points by mikkelenzo 2 years ago · 74 comments

from-nibly 2 years ago

I've never worked at a company where we were anywhere close to being able to do A/B testing on customers. We just always shot from the hip. I think if you are doing that kind of testing, maybe you've run out of helpful ideas?

Talking to real customers and helping them solve real problems is really potent. And you can get more than just the color of a button. You can get the direction your company needs to go for months.

I think part of the problem is that science takes too long. It's like waiting for evolution to play out. Your company is at war with everything: entropy, the economy, your competition, the attention span of customers. Do you have time to science your way to success? Probably not. Do you have time to gamble on your intuition? Barely.

Collecting data isn't bad per se. But you should always be asking yourself if you are solving the right problems before you waste your time on it.

  • traceroute66 2 years ago

    > Talking to real customers and helping them solve real problems is really potent.

    This, this and THIS again !

    Example case (of many I could cite) would be Transferwise.

    They used to be good, but now they've degenerated into a quagmire. If they could be bothered to talk to their customers, or even just send round some box-ticking surveys, they might find that out.

    No amount of A/B testing, data lakes or other "data science" buzzwords is going to help them.

    But no, instead it's the same old story:

    Rebranding from Transferwise to Wise because, well, I guess that's the usual shit companies do when they've run out of ideas (Aberdeen rebranding to Abrdn is another fine example from the financial sector).

    Doing stuff worse because it benefits the business (read: increase margins) rather than the customer. Transfers take forever. Customer service is non existent.

    Funnily enough it all seems to have started going downhill around the same time they floated on the stock market. Funny that !

    Whilst I am aware of the argument that a company's strict legal duty is to put its shareholders first, it doesn't have to be that way, at least not in a blatant manner. After all, disgruntled customers don't do much good for shareholders' pockets.

    • silverquiet 2 years ago

      I’m fairly sure there’s no legal requirement for a company to put shareholders first. I’ll sound a bit Marxist for a moment and say that we’ve just so internalized capitalist propaganda that we collectively seem to believe that now.

      • traceroute66 2 years ago

        > I’m fairly sure there’s no legal requirement for a company to put shareholders first.

        You're likely correct that there is no explicit legal requirement.

        However (as I understand it), it stems from the implied requirement that derives from the fact that a company's directors have a fiduciary duty to act in good faith in the interests of the company.

        People who agree with the implied requirement argue that "in the interests of the company" equates to "for the benefit of its members". And so you then ask yourself who are "its members" and that's where you end up at "its shareholders".

        I believe in the jargon, this is referred to as "the common law approach of shareholder primacy".

        Going back to the "legal requirement" front, there is, for example s172(1) of the Companies Act 2006[1], which starts by saying:

             "A director of a company must act in the way he considers, in good faith, would be most likely to promote the success of the company for the benefit of its members as a whole, and in doing so have regard (amongst other matters) to—"
        
        So "must" is in the context of "benefit of its members as a whole", and a director is "only" required to "have regard" for the other stakeholders that the legislation lists in (a)–(f). It's a bit of a word salad, but it effectively appears to reinforce shareholder primacy.

        [1] https://www.legislation.gov.uk/ukpga/2006/46/section/172

        • Retric 2 years ago

          Firstly, financial benefits are only one aspect of the possible benefits to stockholders. A company may consider reducing pollution a direct benefit to its owners, as that improves their health. There's quite a lot of freedom here: cancer research, for example, benefits people beyond the financial incentives.

          Anyway, the next subsection makes it explicit that shareholder primacy isn't required:

          (2) Where or to the extent that the purposes of the company consist of or include purposes other than the benefit of its members, subsection (1) has effect as if the reference to promoting the success of the company for the benefit of its members were to achieving those purposes.

          (3) The duty imposed by this section has effect subject to any enactment or rule of law requiring directors, in certain circumstances, to consider or act in the interests of creditors of the company.

          Shareholder primacy instead stems from shareholders being able to fire management.

          • traceroute66 2 years ago

            It's clearly quite a technical topic.

            But what I would say, in relation to the (2) that you highlight, is that the "purposes of a company" means those defined in its Articles of Association when the company was formed (and as later amended, if that is the case).

            So I would argue (2) doesn't apply to the majority of companies, many of whom are likely operating off template Articles without expanded purpose definitions.

            In relation to (3), interests of creditors, this was brought before the Supreme Court in recent history[1]. My reading of the summary of the judgement would suggest there is a relatively narrow window for being mandated vs "have regard", in particular:

            "All members of the Court agree that AWA’s directors were not at the relevant time under a duty to consider, or to act in accordance with, the interests of creditors"

            [1] https://www.supremecourt.uk/press-summary/uksc-2019-0046.htm...

        • staticautomatic 2 years ago

          Maybe, but the business judgment rule is usually going to win if you want it to. Unsurprisingly, though, it’s rarely invoked to support doing something other than giving capital to shareholders.

  • data-ottawa 2 years ago

    I hate the equivalence of data to A/B testing. If all your data people do is say a button should be yellow then you're not using data effectively.

    You should be using data to invalidate your assumptions, separate the real from the perceived, and to draw out those aha moments mentioned in the article. Then use that to prioritize and decide what's worth iterating on and when it's good enough to move on to bigger problems.

    As the article says, data won't tell you everything, which is why your data people need to also be product people, and not just SQL monkeys or PhDs in a backroom doing analyses nobody will understand or read.

  • fshbbdssbbgdd 2 years ago

    When I worked at a company that did a lot of AB testing, we had a lot of people whose main job was to interview users. This was an input in the development process. Then we tested whatever we came up with before releasing it.

    I suspect there’s a disconnect here where you are talking about smaller, early stage companies. A lot of the time they don’t have the sample size to do proper AB testing, or the resources to do it properly, and they have less to lose. So shooting from the hip is more likely to be the only reasonable choice.

    • disgruntledphd2 2 years ago

      AB testing works well in B2C, but is much harder to make work in B2B.

      You can do it, but you can generally only test really large changes, and often if you have good customer communication you can pick up on what the change means with some interviews and showing the customer(s) what the new thing looks like.

      This is generally much faster and cheaper (and I say this as someone who adores designing, running and analysing AB tests).

  • makeitdouble 2 years ago

    > Talking to real customers and helping them solve real problems is really potent.

    One of the many issues is you only get to talk to customers willing to honestly talk to you.

    That means you can't hear from potential new customers you wouldn't know were part of your market. You also don't hear from customers who would want to leave you but just haven't put it into words yet.

    A/B testing helps you get more insight into what customers actually do (and not what they tell you), and also gets you numbers on how big an impact your changes have. The time spent waiting for results is insignificant compared to the impact of ill-advised changes in general.

    • tonyarkles 2 years ago

      > One of the many issues is you only get to talk to customers willing to honestly talk to you.

      In the late 2000s I was part of a team that was developing some pretty incredible software to help chip designers manage the added complexity as features got smaller (context: our customers were freaking out about how hard it looked like 45nm was going to be). We did all of these customer satisfaction surveys and shit like that and got... some decent feedback but mostly just all rainbows and unicorns positive reviews.

      Chip design software is complex and every customer of ours needed some custom integration, which is where my small group came in: the three of us were dual-degree EE/CS folks. We could sit down with the chip designers and understand their workflow and then go back to our hotel room at night and write the integration code to connect our tool with whatever bespoke workflow they had internally. All of that story leading up to the main point:

      The feedback I got talking to random people outside in the smoking area was dramatically more valuable than anything we got from our customer surveys. This wasn't a strategy; I'd just go out for a smoke every hour or two and there'd usually be a couple of employees out there doing the same. "Hey, I don't recognize you, are you new?" "Oh, no, I'm here helping with the $X integration" "Oh! Hey, so maybe you can help me then... in the latest release it looks like feature $X should be able to do $Y but I can't seem to get it to work..."

      Pretty much every time I went outside I ended up learning something new, either an interesting way our software was being used or misused, or some other detail about how these guys' day-to-day workflow worked that we hadn't even thought of addressing.

      We had some customers in Japan, too, where there's an interesting social hierarchy in business meetings. The junior engineer across the table and I couldn't talk to each other directly in the meetings; all of the questions had to go through my manager, and a translator, and a senior manager on the other side of the table... a big game of telephone even though we were in the same room. After the meeting I would usually go have a smoke and just happen to find the junior guy from the meeting doing the same. "You know, I do speak English... and I have a few questions, if you don't mind me asking directly" :D

      While I can't recommend picking up a persistent nicotine addiction for doing better user research, I also can't say that I've ever encountered a more organic way to get really good unfiltered user feedback. Surveys, user studies, focus groups, etc... they're all decent tools to varying degrees but don't always get the level of honesty you can get out of someone sharing 5 minutes with you in the smoker's corner.

  • devjab 2 years ago

    In my experience data often becomes the “decision argumentation” role which used to be filled by external consultants from E&Y and similar. Which is both good and bad, because data shows the past and that doesn’t necessarily predict the future.

    Data can be very helpful though. We pull data from the public company records which show earnings to find possible investors. Then we combine that data with our sales data from HubSpot and Microsoft CRM (don’t ask me why we have both) as well as our internal sales systems. Which provides good data points for our sales department when deciding which potential investors to focus on, and shows them how much they’ve already “bothered” people. 10 years ago all of this was basically done by hand, now it’s mostly automated. Which sucks for the data researchers, but since the majority of those used to be unpaid students who now get to actually work on something more related to their studies, it’s mostly a win-win.

    Where data doesn't really help us is in marketing, exactly because it shows us the past. While that can be useful, it often hasn't been very helpful in deciding how to do future campaigns. I imagine a lot of this is also true in other fields which produce content for human consumption. In most "creative" fields the data won't necessarily show you what people will find "fun" or "interesting". I think Hollywood, the big gaming companies mentioned in this article, and others are sort of struggling with this. That is just my guess though, as I only have experience with how our marketing department has come to its conclusion: while data is a good measurement of the success of various initiatives, it's not very useful in helping decide what sort of campaign to run next, beyond which channels are the best focus, and even that changes over time.

  • internet101010 2 years ago

    The big tests that matter are always worth the wait. Making sure people don't screw them up by changing things mid-test can be difficult. Replacing stakeholders' "most likely to be successful" mindset with "most representative of the potential rollout group" isn't easy either.

    Then you have people that try to get greedy. On more than one occasion I have designed a test where two variables change, results are great, rollout projections are great, the stakeholder attempts to do the rollout without changing the variable that creates incremental expense, and the rollout does not meet projections. Then they reluctantly do what they were supposed to do in the first place and everything is fine.

  • coffeebeqn 2 years ago

    I was at a small games company that was way too small (in audience and headcount) to be A/B testing. They landed at a local maximum that was nowhere close to a sustainable business. The leadership was way too afraid to make actual decisions about actual problems and opted to change the color of the button for a few years until they ran out of money. 'Twas a tragic waste of a team and money

  • kilbuz 2 years ago

    Much of A/B testing is far deeper than the color of buttons.

    How much faster can we process payments through provider A versus B in different countries around the world?

    If we offer insurance after checkout, do we convert more than offering it before?

    What ranking algorithm of skus leads to the highest conversion?

    • williamcotton 2 years ago

      And what does any of that have to do with finding the right services or products for the right customers?

      Does a customer care if you shave off 2% of the final cost or do they care about having world-class customer support?

      Does a customer care if a product has a higher conversion rate, or whether it is the product they were looking for in the first place?

      Does a customer care more about how long payment processing takes or do they care that it takes their local mobile payment app?

      The only way to know is to talk to customers. Without doing so you’re just coloring a different variety of button.

      • makeitdouble 2 years ago

        > Does a customer care if you shave off 2% of the final cost or do they care about having world-class customer support?

        Time and time again, they'll tell you the latter and actually choose the former.

        It's the same bargain as paying more to have no ads: people vocally push for no ads, some will pony up the money, and the vast majority will make do with the ad-supported model, blocking ads or giving up on the service when they're fed up with it.

        > Does a customer care more about how long payment processing takes or do they care that it takes their local mobile payment app?

        I'm curious how you find the people to ask that? If your current service doesn't support the payment app, how would you ask whether it was a deal breaker for the people who refused to become your customers and never gave you any of their information?

        • rightbyte 2 years ago

          > Times and times again, they'll tell you the latter and actually choose the former.

          I don't think this is as true anymore.

          There was some period between 1990 and 2015(ish?) when physical widget prices were falling faster than quality was decreasing, and software and computing hardware were getting better, where the quoted strategy made sense.

          Nowadays you will get dropshipped crap (or any service sector equivalent) if you go for the lowest price.

        • williamcotton 2 years ago

          > I'm curious how you find the people to ask that?

          Market research, talking to people in line at the post office, sending a nice personal email to an existing customer?

      • tomnipotent 2 years ago

        We need a new "appeal to the customer" logical fallacy. Talking to the customer is not a panacea for running a business, and the failed-company graveyard is full of products that "delighted customers" but still couldn't cut it in the long run.

        Part of running a business is having to explain to your boss how you spent millions of dollars, and building confidence that you're making sound decisions and not just shooting from the hip. Many times that will mean making decisions in the best interest of the company over the customer, and there's nothing wrong with that.

        There's nothing perfect about A/B testing, and like any tool it can do both good and harm. But when I have to explain to my boss how I'm spending their money, there's a limit to how much lip service I can pay to the customer journey before I have to put my money where my mouth is and demonstrate that I'm putting their cash to good use. A story that includes A/B testing along with qualitative customer research is better than a story that includes just one or the other.

  • nonrandomstring 2 years ago

    The direct, old-fashioned approach of:

      - being real
    
      - talking to customers
    
      - actually listening to what people say
    
      - using intuition
    
      - cutting to the chase (big and meta problems first)
    
      - doing risky exploration for abductive reasoning 
    
    is only as good as the nominal culture we're in. As the author says,

    > I’m no longer a believer in decision-by-spreadsheet.

    That's nice for you. Me neither. But every day we must interact with dull-headed data crunchers who set the pace and policy.

    • hef19898 2 years ago

      All of the above needs some basis in reality, aka numbers. Otherwise it is just guess work.

      • nonrandomstring 2 years ago

        What do you think is wrong with guess work?

        • hef19898 2 years ago

          That you are as often wrong as you are right? If you have numbers, good ones, use them when making decisions.

          • nonrandomstring 2 years ago

            50/50? For terrible guessers maybe.

            For experts with tens of thousands of hours experience in a specialised field, with 40 years of case studies to extrapolate from?

            Let's cast it in more relatable terms:

            Such a person is an enormous collection of data.

            The day will come... soon, when people who "believe in technology" (in the very strong sense) will see no problem putting absolute trust in a neural network trained on exactly that same corpus of data.

            A neural network is of course, a magnificent black box statistics machine.

            And what are statistics machines trained on? Numbers. But they process and relate to them in a fuzzy way.

            What is a spreadsheet and data analytics suite? Numbers.

            Now your human specialist is going to outperform the numbers machine every time. But the human can often not introspect their ineffable knowledge (most expert knowledge is like that, which is why we developed the entire field of expert systems: to make it legible).

            So if we choose to call such knowledge "feelings" or "guesswork", we're making a silly mistake. What does that even mean?

            Neither can the neural network introspect. But we choose to label that ineffable knowledge as "calculation".

            And so you invoke the magical properties of "Numbers!" (did you mean real or imaginary ones? :)

            You see the error we fall into, giving two different labels to the same process only because of the hardware it executes on?

            What I'd really like to talk about is the logical process of discovery called "abduction", but I fear I am rambling already :)

            • hef19898 2 years ago

              I consider myself to be rather good in my field. Which is exactly why I take every bit of data I can get before I provide my opinion or decide something.

              Every situation is different, facts change, so I have to re-evaluate my opinion each and every time (which is how you learn and become better). And the more data I have, the easier this is.

              • nonrandomstring 2 years ago

                Like me, you opine and decide. Sounds like you and I are both experts who take advantage of all the tools in the box, numerical, computational and messy wetware. Sorry if you may have taken umbrage with my depiction of "dull headed data slaves". What I was referring to there are people who only use the numbers. For them there are no opinions or decisions. Only calculations. And as per TFA, it is that mentality that stymies innovation and good decision making.

                • hef19898 2 years ago

                  On that, we absolutely agree! In the end, those who let some calculation decide for them and those who just throw solutions at the wall are equally bad at decision making.

                  I know, because I made my share of bad decisions, especially early on in my career.

                  • nonrandomstring 2 years ago

                    Yes me too. All my royal screw-ups came of hubris and imbalance between measurable facts and feelings I ignored. Respex.

                    • hef19898 2 years ago

                      I usually added a fair share of not listening to advice from people I didn't want to listen to, for various reasons. Now I try my best to separate the message from the messenger as much as I humanly can.

                      Part of the learning experience, I guess?

  • carschno 2 years ago

    And because “science takes too long” (and is expensive, and tedious), people tend to fall back to pseudo-science. There is often no way to derive robust numbers from techniques like A/B testing, for instance because of confounding factors that are not measurable. Given that, I have regularly heard the argument that “these are the only numbers that we have”.

    Relying on such numbers, however, is equivalent to falling back on intuition and gut feelings for decision-making (or worse), while believing that the decisions were based on numbers.

  • Eridrus 2 years ago

    I think the discourse is converging on A/B testing being a big-company thing, where it is indeed very useful.

    Not because it finds new knowledge, but because it keeps your product teams honest.

    It's really easy to delude yourself and others about your project when your promotion is on the line, and A/B tests let you actually evaluate whether the change helped or not.

    At small companies, you're not trying to find 2% effect sizes; anything that small is already a failure, so you don't need statistics to tell you what worked.
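    To put rough numbers on that 2% point, here is a back-of-the-envelope sample-size calculation (standard normal approximation; the 5% baseline conversion rate and the significance/power levels are illustrative assumptions, not from the comment):

```python
# Rough sample size per arm for a two-proportion A/B test,
# using the normal approximation.
def samples_per_arm(p_base, rel_lift, z_alpha=1.96, z_beta=0.84):
    """z_alpha=1.96: 5% two-sided significance; z_beta=0.84: 80% power."""
    p_new = p_base * (1 + rel_lift)
    variance = p_base * (1 - p_base) + p_new * (1 - p_new)
    return (z_alpha + z_beta) ** 2 * variance / (p_base - p_new) ** 2

# Detecting a 2% relative lift on a 5% baseline conversion rate
# needs roughly 750k users per arm -- big-company territory.
n_small_lift = samples_per_arm(0.05, 0.02)

# A 20% lift, by contrast, needs under ten thousand users per arm.
n_big_lift = samples_per_arm(0.05, 0.20)
```

    Which is the quantitative version of the comment's point: the effects a small company can afford to care about are big enough to see without a significance test.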

TheAlchemist 2 years ago

As painful as it is to admit, I've come to the same conclusion, after wasting quite a lot of time and efforts.

As Warren Buffett likes to say - "It's better to be approximately right, than to be precisely wrong"

This should be a poster in every company doing any kind of data.

  • ysofunny 2 years ago

    I think in some other contexts it's actually preferable to be precisely wrong than roughly correct

    maybe this is the difference between a business (or engineering) mindset of "it must work effectively, how and why it works are secondary"

    in contrast with a perhaps more philosophical (scientific? purely mathematical? reverse engineering?) study goal, in which case whether something works is secondary to having a full theory of what's going on

zurfer 2 years ago

Data is for seeing problems, not for finding solutions. As in the author's example: you lose 10M in fraud? Good that you monitor that! Otherwise it would be hard to justify spending time on it.

VyseofArcadia 2 years ago

I worked for a large company that purports to be data-driven.

I watched that company converge on the blandest, clunkiest, least useful features over and over and over again.

Blindly trusting the data without any product vision is just design by committee at scale.

laichzeit0 2 years ago

Maybe I’m missing something. The goal of an A/B test is to test a hypothesis. Where that hypothesis comes from is irrelevant. Sure you can waste your time testing stupid hypotheses that don’t have a lot of business impact, but that’s beside the point.

  • VyseofArcadia 2 years ago

    No, in big tech the goal of an A/B test to gather data that you can cite in your next performance review so you can increase your odds of getting a good bonus.

  • hdjrudni 2 years ago

    A/B testing requires you to build both A and B. Or if you already have A, B can't be so radically different that any comparison is meaningless.

    • myhf 2 years ago

      If you hire engineers just to deny them to your competitors, then building B is free.

  • kqr 2 years ago

    I agree. The research part of product development should focus on discovering the fundamental laws of the domain. These are fairly constant and pay off for years after their discovery. They also help you get a better intuition for the business you are in.

    Hypothesis testing is a great way to discover laws.

__MatrixMan__ 2 years ago

I don't understand. If you don't already have a thing that you're trying to do, then what cause would you have to collect and analyze data in the first place? Did these people hit their head and forget something important?

inopinatus 2 years ago

The value is in the negative space: most commonly, the rejection of a hypothesis.

That is to say: data can certainly advise you what not to do. Such as flying the ship into that spooky nebula, Captain.

fuidani 2 years ago

> Intuition is underrated

> Spend time where your customers are and make your own conclusions.

This is a great article, very well-written, and I enjoyed reading it. However, could intuition and spending time with customers be considered another way of collecting data points to inform data-driven decisions?

  • wodenokoto 2 years ago

    Generally, when people talk about being data-driven, they mean quantitative data, not qualitative data.

    • epgui 2 years ago

      A pet peeve of mine is when people assume qualitative data is less rigorous. It's really just as rigorous and just as valid, and can sometimes require even more sophisticated analysis.

      Anyone can plug numbers into a formula (there are so few barriers to doing that that most people probably do it kind of wrong and still get approximately directionally-correct results), but handling qualitative data requires really knowing what you're doing from first principles.

    • xeonmc 2 years ago

      Or you could just call it “Natural Neural Network data integration”

richrichie 2 years ago

> Their research shows that a nonlinear approach drawing from anthropology, sociology, philosophy, and psychology, is better at getting to the moment of clarity

And all of these come from data. There is no non-linearity here; just a widening of the perceptive funnel.

KingOfCoders 2 years ago

I think data-driven is orthogonal to opportunity-driven and vision-driven. With the first, data is used to find opportunities; with the second, it shows you whether your strategy toward your vision is working or not.

Then there is politics-driven development. You want to do something and search the trove of data for data that supports what you are doing. Or you look at the data in a strategy meeting, and then ignore it (I've seen this happen mostly in board meetings of large companies).

  • megamix 2 years ago

    I like this explanation. I wonder about this on a philosophical level -- is data analysis just our defense mechanism to gain control over things?

    Another thing to say is that not all companies should or can be vision-driven; some companies are just copycats.

scott_w 2 years ago

I've worked in situations where A/B testing is heavily used (and useful) and in situations where it was completely useless. Both in the same company but at separate points in the flow.

Where you have enough users (maybe you're a major online retailer), A/B testing should be a vital part of your toolkit. Not the only tool, but you definitely need to test every change you make. If you can gather enough data within 24 hours, why wouldn't you test your change?

That being said, A/B testing isn't the be-all and end-all. It just gives you some information to make a decision. You still need to know your customer, speak to them, survey, observe, etc. You might even pick a "losing" variation with the aim being to reach a more optimal business outcome. Data doesn't give you the right to abdicate your responsibility to make good decisions.

There are cases where A/B testing can't help at all. A great example is in low-volume but critical flows (think SaaS conversion funnels). For these, you need to rely on the other skills you have at your disposal.

iraldir 2 years ago

Another point in support of the same idea: data can be fallacious.

For instance: you are making a pretty advanced 3D web app, and you notice in your analytics that your userbase is only Chrome and Safari users.

An easy conclusion is to focus your testing on those two platforms, or maybe even drop support entirely for Firefox and Edge by using some WebKit-specific API.

A not-so-easy conclusion is that the experience might be so bad or buggy on a non-WebKit browser that anyone who tries the app there just gives up on it.

The reasonable truth in this case? You should assume the standard browser distribution unless you're operating in a specific market; it might also be perfectly fine to drop non-WebKit browsers if the ROI of developing for them is not worth it for your goals, etc. All of which needs not data but rather intuition and common sense.

  • nicbou 2 years ago

    And a lot of privacy-friendly browsers don't show up in the stats by design.

  • GeneralMayhem 2 years ago

    This is pretty standard survivorship bias: a well-studied problem with a famous pithy example (bomber plane armor) that many people and organizations still somehow fail to learn from.

  • jorticka 2 years ago

    It goes deeper.

    People learn over time that complex web apps work badly in Firefox, because developers mostly test in Chrome.

    So they don't even bother trying it in Firefox.

    • nairboon 2 years ago

      Maybe. Or maybe Firefox users have sophisticated ad & tracker blockers, sometimes even UA spoofing. Then it really depends on your analytics method whether these users show up at all. You might have a lot of FF users, but analytics tells you otherwise.

PCalvanelli 2 years ago

I love this piece. It evokes the well-known idea that "we shape our tools, and thereafter our tools shape us," drawing attention to the concept that digitally represented information serves as an observer's or machine's recorded testimony of a physical or cognitive system.

What is the value of that?

Language alone, or in this case, information, does not dictate our actions. However, there is persuasive power inherent in language — specifically, language that exposes the subjective gains individuals aim to achieve through their actions, often influencing individual behavior.

There exists an unexplored connection between our contemporary understanding of data and praxeology.

newaccount74 2 years ago

I'm a bit torn on this. I think basing product decisions on analytics is a bad idea, because the numbers can only tell you about features your software already has.

But analytics/diagnostics are extremely important to discover bugs, because you can't rely on customers to tell you about them.

Arubis 2 years ago

Data _can_ tell you what to do, but that doesn't mean your data selection and gathering was precise, right, and aligned to your actual best interests.

Data-informed decisionmaking is great. Data-driven decisionmaking, not so much. You still need to trust your gut.

tomrod 2 years ago

Good insights. Before the term was bastardized, then consumed, by Machine Learning, the experimentation component was considered the killer delivery of data scientists (putting the _science_ in the term). Now most folks assume MLE := DS, rather than MLE ⊂ DS.

  • jval43 2 years ago

    OT: there's actually a ≔ symbol, so you could write MLE ≔ DS if you wanted to.

spandrew 2 years ago

I love this article; it's well written and accurate. But in my experience the bigger problem is that companies aren't using data at all.

Data-ignorant decision-making is a killer, too.

j7ake 2 years ago

Data may not tell you what to do next, but models can. For example, the Bayesian optimisation framework tries to do exactly that.
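To make the idea concrete, here is a toy Bayesian optimisation loop: a Gaussian-process surrogate models the objective from past observations, and an upper-confidence-bound acquisition picks the next point to try. This is a minimal sketch with an invented objective function; real use would reach for a library such as scikit-optimize:

```python
# Toy Bayesian optimisation over [0, 1] with a GP surrogate and UCB.
import numpy as np

def rbf(a, b, length=0.3):
    """Squared-exponential (RBF) kernel between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

def objective(x):
    # Hidden function we want to maximise (unknown to the optimiser);
    # its true optimum is at x = 0.7.
    return -(x - 0.7) ** 2

grid = np.linspace(0.0, 1.0, 101)   # candidate points to evaluate next
xs = [0.0, 1.0]                     # initial observations
ys = [objective(x) for x in xs]

for _ in range(10):
    X = np.array(xs)
    K = rbf(X, X) + 1e-6 * np.eye(len(X))   # jitter for stability
    k_star = rbf(grid, X)
    K_inv = np.linalg.inv(K)
    mu = k_star @ K_inv @ np.array(ys)      # GP posterior mean
    var = 1.0 - np.einsum('ij,jk,ik->i', k_star, K_inv, k_star)
    ucb = mu + 2.0 * np.sqrt(np.clip(var, 0.0, None))  # acquisition
    x_next = grid[np.argmax(ucb)]           # most promising point
    xs.append(x_next)
    ys.append(objective(x_next))

best = xs[int(np.argmax(ys))]   # should land near the optimum at 0.7
```

The loop literally "tells you what to do next": each iteration, the acquisition function trades off exploiting points the model believes are good against exploring points it is uncertain about.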

  • AlotOfReading 2 years ago

    And as the saying goes: all models are wrong (but some are useful). You're going to have to sit down and do the thinking eventually anyway. May as well start by incorporating it into your process from the beginning.

    • bloomingeek 2 years ago

      This is a great insight. In the past we called it fish-boning an idea, looking at every possible angle based on data AND other factors. Data is not the same as knowledge and knowledge isn't the same as wisdom.

oxfordmale 2 years ago

A/B testing is valuable for optimising an existing process once the low-hanging fruit has been tackled, but it should also be discontinued when the return on investment becomes negligible. Because of organisational momentum this rarely happens, since the team would be making itself redundant. The result can be A/B testing noise.

Data also provides valuable insight in a negative manner. If your conversion rate is abysmal, the data tells you to get out of your cubicle and start talking with real customers to find out what the data isn't telling you. It is still a data-driven decision. It is just a negative one.

However, in the end, data isn't going to find your next billion-dollar opportunity. You need to find a gap in the market that no one has tackled before, and for which, of course, there is no data; otherwise someone else would have jumped on it.

DeathArrow 2 years ago

Both data mining and intuition have their uses. The key is to recognize when you should use each or both.

kromem 2 years ago

It might not tell you what to do, but done right it sure as heck can point you in the right direction.

megamix 2 years ago

Thank you! This guy has obviously read Nassim Taleb. Here you go upvote!

BlackFly 2 years ago

Even if you know with absolute certainty that changing the graphics on that packaging increases conversion (removing the disgusting images of cancer patients), perhaps we shouldn't be selling more cigarettes?

Data informs your decisions, but your values are a choice; from the very beginning, data will never define them. Even with perfect knowledge, your decisions are still choices. Combine that with the fact that our knowledge is imperfect (our data is incomplete, biased, a single and partial perspective)...

"A company should seek to maximize its profits," is a normative statement, not a truth. It is a choice of values.

thefatboy 2 years ago

unless you listen
