93-year-old YouTuber back in business after being kicked off platform

cbc.ca

33 points by fbelzile a year ago · 72 comments

qmarchi a year ago

Disclaimer: Former Technical Solutions Engineer for GCP, aka Support for Customers. Also Former Engineer on YouTube Caching.

To get it out of the way: I do not agree that it should've taken a journalist's involvement to get this situation resolved.

However, I'd like to ask Hacker News: how would you handle support requests for a product with >2.7B users, almost all of whom are not directly revenue-generating, across hundreds of different languages, in every conceivable location in the world?

It's an extremely hard problem to solve, but I don't think anyone has got it right. I'll be playing devil's advocate in the comments. Keep me busy for my flights.

  • ryandrake a year ago

    In this case, an entire channel was shut down (with no opportunity for appeal) on account of a single instance of someone zoom-bombing a live stream with pornography. Presumably the decision was entirely automated and there was no human in the loop.

    It doesn't matter how many users you have. This "solution" seems like swatting a fly with a nuclear weapon. Why not just take down the offending video until the user takes corrective action? YouTube can clearly identify the offending video out of the non-offending ones, so that's not a technical problem. And it can be done entirely with automation, so it wouldn't need humans. Further, they obviously can tell that the user does not have a history or track record of this kind of activity. Why do these tech companies always go straight to the "no recourse ban hammer"?

    • Cthulhu_ a year ago

      > Why not just take down the offending video until the user takes corrective action?

      I believe this is how Twitter handles / handled rule-breaking content: you got an infraction / suspension until you acknowledged and deleted the offending tweet.

      Of course, I believe videos are a lot harder, because video content takes more effort to analyse than relatively short, plain-ish text messages, especially when automated.

    • qmarchi a year ago

      Two of the biggest factors in events like this are:

      - Regulatory restrictions: many jurisdictions limit how accounts like this can be treated.
      - Repeat behavior from others who have breached this segment of the ToS.

      A good automation would be integrating something like what they already have for Music Copyright, where you can automatically trim the segments around the conflicting content.
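The trimming idea amounts to plain interval subtraction. A minimal sketch of it, assuming a hypothetical representation of flagged `(start, end)` timestamps and an invented `padding` margin (this is not YouTube's actual system):

```python
def trim_flagged_segments(duration, flagged, padding=5.0):
    """Return the (start, end) segments of a video to keep, given
    flagged intervals, each widened by `padding` seconds of safety
    margin. Everything outside the padded flagged intervals survives."""
    # Widen each flagged interval, clamped to the video bounds.
    padded = sorted((max(0.0, s - padding), min(duration, e + padding))
                    for s, e in flagged)
    # Merge overlapping flagged intervals.
    merged = []
    for s, e in padded:
        if merged and s <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], e))
        else:
            merged.append((s, e))
    # Keep the gaps between merged flagged intervals.
    kept, cursor = [], 0.0
    for s, e in merged:
        if s > cursor:
            kept.append((cursor, s))
        cursor = max(cursor, e)
    if cursor < duration:
        kept.append((cursor, duration))
    return kept
```

For a 10-minute video with one flagged interval at 100–110s, this keeps everything except a padded 95–115s window, rather than nuking the channel.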

      • gooosle a year ago

        You keep alluding to these regulatory restrictions which supposedly force this kind of behavior from Google - please go ahead and tell us what those are.

    • sbarre a year ago

      Yeah it feels like there should be an escalating series of consequences for infractions..

      It should have been pretty straightforward to establish a pattern (or lack of) around whether this was an intentionally abusive channel or a first-time offence.

      But of course this costs money and takes time to build.
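The escalating-consequences idea is cheap to express, whatever it costs to operate. A minimal sketch, with action names and the strike ladder entirely invented for illustration:

```python
from dataclasses import dataclass

# Hypothetical escalating-enforcement ladder: a first confirmed offence
# hits the video, repeat offences escalate toward the channel.
ACTIONS = ["remove_video", "suspend_uploads_7d", "terminate_channel"]

@dataclass
class Channel:
    strikes: int = 0  # confirmed ToS violations on record

def enforce(channel: Channel) -> str:
    """Return the action for a new confirmed violation and record a strike."""
    action = ACTIONS[min(channel.strikes, len(ACTIONS) - 1)]
    channel.strikes += 1
    return action
```

Under a ladder like this, a first-time offender such as the channel in the article would lose one video, not the whole account.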

      • some_random a year ago

        They sorta have this already for copyright strikes, but I guess porn was judged to be a more impactful offense?

        • Cthulhu_ a year ago

          Yes; they cannot take any risks with that: laws around showing porn to people / minors (accidentally or not) are stricter than copyright law. I'm not a legal expert and honestly I'm pulling this out of my ass, but porn is a criminal matter, while copyright infringement is a civil matter.

          There's also the advertisers and payment processors to consider, most of them will have nothing to do with porn. This triggered some major pruning of various websites too, Tumblr and Reddit on the one side, but huge chunks of Pornhub and co as well for any porn that didn't come with the right paperwork, including identification and consent forms from the participants.

  • oidar a year ago

    >> However, I'd like to prompt Hacker News with how would you handle receiving support requests from a product that has >2.7B users. Almost all of which are non-directly revenue generating, across hundreds of different languages, in every conceivable location in the world.

    2.7B users is a lot, but how many of those are established content creators (like say - more than 10k subscribers) that are banned on a daily basis? How many people would it take to review those cases?

  • vundercind a year ago

    > However, I'd like to prompt Hacker News with how would you handle receiving support requests from a product that has >2.7B users. Almost all of which are non-directly revenue generating, across hundreds of different languages, in every conceivable location in the world.

    One way: dollars.

    Another way: don't provide services you can't support.

    The only way it's actually going to happen, and when it does we'll shockingly find it was always possible and companies just didn't feel like doing it: regulation (which will just result in a mix of the first two options)

    • sbarre a year ago

      > don't provide services you can't support.

      Also playing devil's advocate here: Define "support".

      I'm sure Google will tell you that they support their billions of users just fine, relatively speaking, and that the percentage of people who fall through the cracks is an acceptable margin (to them, obviously not to the users themselves).

      To your point about "not providing the service": do you believe the trade-off, if Google for example stopped offering free-tier YouTube uploads, would be worth it to provide better support to paying users?

      Would the incredibly massive reduction in uploaded content be worth it?

      Or do we have to live with these kinds of gaps in order to get the rest?

      • vundercind a year ago

        I don't think the vacuum left by Google exiting free video hosting would last long.

        There are lots of potential solutions to the problem, that aren't all very-expensive centralized services, but YouTube existing takes up all the oxygen in the room.

        • sbarre a year ago

          Wouldn't a new player in the space run into the same challenges though? I don't think this is a Google-specific problem (although they are particularly bad at it).

          • vundercind a year ago

            Their moderation challenge is one of centralization and insistence on taking control of promoting videos. Nobody needs to, or does, police my email to make sure I haven't subscribed to any naughty newsletters. (you know, aside from five-eyes programs or whatever—but businesses don't). [EDIT] To pre-empt an obvious objection, yes, spam filtering exists and Google and others will absolutely sometimes blackhole email, but I can tune that personally, and I can use other email services and still get the exact same emails—my point is that it's not impossible for relatively unmoderated content distribution to exist on the Internet—what's impossible is Google doing it the way they've decided to without spending more money than they are, or without screwing over and silencing people for bad reasons far too often.

            Their problems (not the problems of distributing video, but the problems of doing so as a centralized platform beholden to advertisers that also engages in automated promotion of videos) are a choice they made.

            • sbarre a year ago

              Hmm interesting. I sort of see your point here..

              I would worry that a much more decentralized approach would just lead to inconsistent and fragmented moderation, and it would feel arbitrary.. Most creators, and viewers, wouldn't know where to go to find their content.

              For better or worse, YouTube's centralization makes discovery very easy, and incentivizes creators to invest in their work because they see greater returns from such a large audience.

              I'm not sure I agree that in 2024 it's "not impossible" for a business to host completely unmoderated content on the Internet, especially video.. The amount of behind-the-scenes moderation (by humans or machines) that happens on the big platforms has been well documented..

              I think video hosting is something that is hard to do not-for-profit, at least in a way that is approachable for the average viewer (i.e. doesn't have the barriers of something like PeerTube etc)..

              And even if they could host unmoderated content, I think we all know what happens there (see KiwiFarms, 4chan, Rumble, etc)... they become spaces that the average creator doesn't want to be associated with because all of the extremists (of all kinds) end up there.

              • vundercind a year ago

                Yeah, the inherent problems of video hosting are real and highly-devolved decentralized hosting (e.g. IPFS) is nowhere near a satisfactory solution in a world where most end user devices sleep much of the time and need to conserve battery power, but these are separate from the reasons that Google finds humane moderation too difficult to even credibly attempt—the reasons video hosting is necessarily hard, aren't the reasons YouTube moderation is hard. I do agree that discovery would be a problem to solve, but I don't think it's insurmountable. After all, I hear running a search engine can be profitable...

                I think at least separating hosting and the not-necessarily-connected role of curation and promotion (plus, maybe, separating distribution from hosting) would go a long way to solving a lot of problems. That'd basically require regulation for it to actually happen, though, because there's too much value in capturing that entire vertical, effectively "dumping" on parts of that potential-market to feed network effects and build a moat around for whichever part of it (the ad-laden curation and promotion interface, in YouTube's case) you're making money on.

                This is how a lot of Google's—among others'—properties work. It's frustrating because dumping "free" products for the purpose of marketshare-capture in adjacent markets stifles not just interest in other commercial efforts with different funding models, but also FOSS or truly-free hobbyist efforts. I think this kind of thing is also why we basically don't develop new open Internet protocols anymore, or if we do, they don't take off, even when the need is there—they compete with free but deliberately closed and non-interoperable services funded by ads or propped up by other wildly profitable services, so are DOA even if you can convince anyone that trying to develop them in such an environment is worth their time in the first place.

    • bryanlarsen a year ago

      Are you saying that we should kick all the poor people off of Youtube?

      • vundercind a year ago

        I'm saying if you provide a service you can't support, stop and make room for someone else to try.

        Of course they'd rather provide abusive service to hold on to user-share. That's a choice they're making and I'm saying that's a bad choice, socially speaking. Good business choice, I'm sure.

        • bryanlarsen a year ago

          > stop and make room for someone else to try.

          In other words, kick everybody off of Youtube.

          • vundercind a year ago

            If Google would rather do that than stop harming users who have no human recourse unless the media gets involved, then yes, that would be a more ethical choice than what they're doing now. They could also just stop being bad at the role they've elected to take on in the first place.

  • tossandthrow a year ago

    It is easy: Allocate the resources needed - hire more customer reps.

    This would cut into margins, but maybe it is not possible to run hyper-scale companies managed by only a couple of engineers.

    And maybe we should not accept that profit seeking people want to do that anyways.

    • lolinder a year ago

      > This would cut into margins, but maybe it is not possible to run hyper scale companies only managed by a couple of engineers.

      It wouldn't cut into margins, it would make YouTube wildly unprofitable, with no viable path to monetization that would ever pay for the support burden.

      I realize that some on HN—it sounds like you included—are perfectly happy to argue that if a company can't provide human customer support to every one of their users then that company shouldn't exist, but most of YouTube's users would fervently disagree.

      • some_random a year ago

        There are two things here. First, no one is suggesting that YouTube needs American call-center reps available for every 9-subscriber channel; they're saying it needs some kind of non-automated support staff. Second, this isn't just a YouTube thing: Google in general is famous for underinvesting in support. There's no reason to think YouTube wouldn't be in the exact same position if it were wildly profitable, like, say, GCP.

      • taormina a year ago

        That is absolutely hilarious. There is definitely a middle ground between today’s “spend $0 on live human support” and “spend X% of YT revenue on the support a platform like YT demands”. There exists an X% where they are plenty profitable. It’s a video hosting platform subsidized by the largest ad network in the world.

        • lolinder a year ago

          They don't spend $0 on live customer support, they spend X%.

          At their scale there will always be high profile stories like this unless and until they spend enough to support every single user who runs into problems with their automated systems. Spending that much is completely cost prohibitive, so we should never expect to reach a point where we stop seeing stories like this.

          • taormina a year ago

            What’s the job title of that live customer service agent? And how do you reach them? To the observing world, they act like it’s $0. There’s a world of difference between “sure, this is an expected edge case” and “oh well, let’s just let our systems systematically screw everyone”. We have this dog and pony show every week. Most days of the week. They don’t only destroy creators who do well on YT; you just don’t usually get critical traction on your tweet. Then what do you do? Beg on HN and pray?

      • tossandthrow a year ago

        Clearly, as the parent commenter wrote, they are not able to solve it technically.

        There is a segment on HN, it sounds like you included, who believe that it is OK to entirely defer to algorithmic governance without any legal oversight.

    • Cthulhu_ a year ago

      > hire more customer reps.

      For that scale, you're looking at an army of tens of thousands of customer reps - on top of however many they already have. I don't know how Google does it, but FB has a number of subsidiaries or contracted companies across the world that spend their days doing content moderation.

  • some_random a year ago

    I agree completely that it's a really hard problem, but this isn't an unusual case in a non-English language. Accounts are going to be hacked all the time, and all the evidence should be easily available to verify the claim that an attacker uploaded the porn, not the original account holder. There are plenty of other platforms with monstrous support requirements, and while none do it perfectly or maybe even well, the popular perception of Google is that they have a policy from on high to not even try. This is across all Google products, by the way: Re-Logic (developers of Terraria) were locked out of their Gmail and canceled their Stadia port over it [1]. I was trying to find a specific story related to GCP, but all I keep running into is people complaining about support [2][3][4]. These are especially bad because they affect customers who are contributing to a very high-margin part of the business, and in some cases those customers have a line item for the support. Obviously bad experiences get talked about orders of magnitude more than good ones, but Google really is well known for this.

    [1] https://www.techspot.com/news/88563-re-logic-cancels-terrari...

    [2] https://www.reddit.com/r/googlecloud/comments/m3hi63/whats_g...

    [3] https://www.reddit.com/r/googlecloud/comments/1ey0rx8/gcp_su...

    [4] https://www.reddit.com/r/googlecloud/comments/owt679/how_doe...

  • fakedang a year ago

    Well for starters, cut the issues at source?

    The article mentions that the channel was taken down because a hacker in the live Zoom meeting (being streamcast into YouTube) played porn. YouTube could have simply blocked that single YT video while retaining the rest of the channel.

    If multiple instances of users hacking Zoom meetings came to light, Google could simply block Zoom from streamcasting videos into YouTube until they fixed their shit.

  • gooosle a year ago

    You start by not banning people for nothing.

    • qmarchi a year ago

      Except that you risk breaking laws in several jurisdictions if you don't. And you can't fall back to "manual" enforcement: think of the massive investment in human resources you'd need just to keep up with YouTube's upload rate.

      • prmoustache a year ago

        If it is not sustainable, the service shouldn't be run.

      • CamperBob2 a year ago

        > Except for that you risk breaking laws in several jurisdictions if you don't.

        Why isn't the market addressing this? Why doesn't someone start a video service in the US and for the US -- one that follows US law, and not Pakistan's or Iran's or India's or Germany's or anyone else's?

        Oh, right, because then they can't sell ads in those places, or use them as tax shelters. Silly question, I guess.

        By forcing compliance with oppressive regimes (and with pearl-clutching sponsors in less-oppressive ones), advertising will eventually end everything good about the Internet. If you work in that business, you're the problem. Consider a different career.

      • gooosle a year ago

        Give me a single example of laws in any jurisdiction that require this user to be banned.

        There was absolutely no reason to ban in this case and you know it. If there was one video where rules were unintentionally broken, and the user has no history of that before, you remove the video, not completely ban the user.

  • levkk a year ago

    Reddit works just fine. Where there is a will, there is a way. I do believe (without proof) that incentives for Google are misaligned here.

    • sbarre a year ago

      So while Reddit is huge, I don't think it's even close to the same scale as Google. And problems at Google's scale are hard to compare to smaller scenarios.

      I do agree with your second point about misaligned incentives, though. I don't think "how do we ensure that every user can get fair support" was ever on any product roadmap for these free global-scale products..

      Or more accurately, the "users" in this case are the advertisers, not the uploaders.

    • qingcharles a year ago

      This isn't correct. I've just been discussing the shadowbanning/appeal system with Huffman because it doesn't work right now.

    • some_random a year ago

      Reddit is definitely a different beast: it's easier to admin, and it has unpaid community moderators who pick up pretty much all the grunt work.

  • mikequinlan a year ago

    Is there any reason that governments should allow that business model to exist? Why can't the government require a reasonable level of support? And if that restricts how big a service can get, then that is just fine.

  • criddell a year ago

    Do what Microsoft does (or at least used to do). Charge a fee for support and if the problem is due to a bug or poor online help, refund the fee.

    The risk is of bad incentives when providing support becomes profitable...

  • CamperBob2 a year ago

    It's a social problem, not a technical one. Facilitate a hierarchy of trust among users who more-or-less volunteer to moderate at the lower trust levels and engage on unjustly-banned users' behalf with YouTube staff at higher trust levels.

    Better yet, pay them. It is, after all, work.
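One hypothetical shape for that trust ladder, with the level names and the promotion threshold entirely invented:

```python
# A sketch of a volunteer trust hierarchy: members triage reports,
# triagers review bans, and advocates can escalate unjust bans to
# platform staff. Promotion is earned by decisions upheld on review.
LEVELS = ["member", "triager", "advocate"]

def promote(level: str, upheld_reviews: int, threshold: int = 50) -> str:
    """Promote a volunteer one level once enough of their moderation
    decisions have been upheld; otherwise keep their current level."""
    i = LEVELS.index(level)
    if upheld_reviews >= threshold and i < len(LEVELS) - 1:
        return LEVELS[i + 1]
    return level
```

The point of gating promotion on upheld decisions is that trust is demonstrated, not claimed, which also makes the role easier to justify paying for.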

    • andrewinardeer a year ago

      Why pay?

      Reddit, a multi-billion-dollar company, has perfected the art of exploiting unpaid volunteer work.

      So much so that when said workers rebel against the administration they get booted from their position. Moderators are easily replaced as there is always someone willing to toe the administration's line.

  • Brian_K_White a year ago

    Less than 2.7bn:1 user:staff ratio.

    Do you have any other impossible conundrums I can clear up before coffee?

  • prmoustache a year ago

    Do you have actual numbers on how many accounts are banned and how many appeal processes are triggered every day, worldwide and by country?

  • papageek a year ago

    All the services are revenue generating. Advertisers pay for eyeballs.

  • ksynwa a year ago

    Seems like you want all the benefits of having monopolised video upload but none of the downsides.

clord a year ago

The automation should be setting flags on videos. Users should have preferences for opting in or out of flags, with reasonable defaults. If there is a jurisdictional requirement in a user's location, YouTube sets the preference to disabled according to the law and shows a link to the regional law so users understand.

Hence abuse is a local thing too. One can be getting flagged in one region but not in another. 'Abuse' amounts to getting certain flags auto-applied in some locations, or whatever. It should not affect the account itself, though.
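The flag-and-preference model described above could be sketched like this; the flag names, defaults, and jurisdiction table are all invented for illustration:

```python
# Per-video flags are matched against per-user opt-ins, with regional
# law able to force a flag off regardless of user preference.
DEFAULT_PREFS = {"adult": False, "graphic_violence": False}

# Hypothetical jurisdictional overrides: flags forced off by local law.
REGION_RULES = {"DE": {"adult": False}}

def visible(video_flags, user_prefs, region):
    """A video is visible unless it carries a flag the viewer's region
    forbids outright, or one the viewer has not opted into."""
    prefs = {**DEFAULT_PREFS, **user_prefs}
    forced = REGION_RULES.get(region, {})
    for flag in video_flags:
        if flag in forced and not forced[flag]:
            return False  # disabled by regional law, with a link shown
        if not prefs.get(flag, False):
            return False  # viewer has not opted in to this flag
    return True
```

Under this model, a flag changes who can see a video where, while the uploader's account is untouched.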

oidar a year ago

Yet another instance of where the right thing is done by Google only if the journalists gets involved.

  • edm0nd a year ago

    There's so many stories of legit YouTube creators being just nuked by Google and not being able to get any help unless they are huge channels or the media gets involved. It's really pathetic and sad.

    Do better, YouTube/Google.

  • amelius a year ago

    Capitalism is the new authoritarianism.

xmuslims a year ago

YouTube also blocks ex-Muslim YouTubers. Google has become another evil for them to deal with.

anoncow a year ago

A company like Google should not be allowed to run a company like YouTube. They should be separate entities.

  • deadbabe a year ago

    YouTube should be an entirely non-profit entity, with no interests beyond delivering content as fast as possible.

  • Cthulhu_ a year ago

    I suspect that on both sides of the ocean there are parties clamouring to break up Google; YouTube as its own company would make sense, but that assumes it's financially and technologically healthy enough on its own.
