The mermaid is taking over Google search in Norway

alexskra.com

903 points by oarth 4 years ago · 358 comments

Ueland 4 years ago

I have some experience in this field. Around two years ago I was a DevOps engineer for the company running Dagbladet, Norway's #2 newspaper. One of the things I did was keep an eye on mysterious traffic.

I managed to find a huge spam network that set up a proxy service that delivered normal content, but injected "you can win an iPhone!" spam to all users visiting them.

Since I was in a position to monitor their proxy traffic towards the many sites I managed, I could easily document their behaviour.

At the same time, I wrote a crawler that visited their sites over a long, long period. I learned that they kept injecting hidden links to other sites in their network, so I let my bot follow those as well.
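For what it's worth, a bot that hunts for hidden injected links like this can be quite small. A stdlib-Python sketch; the "hidden" heuristics here (display:none, visibility:hidden, the `hidden` attribute) are assumptions, since the comment doesn't say how the links were concealed:

```python
from html.parser import HTMLParser

class HiddenLinkExtractor(HTMLParser):
    """Collect href targets from anchors styled to be invisible.

    Spam networks often cross-link sister domains with links a human
    never sees; a crawler can follow exactly those to map the network.
    """

    def __init__(self):
        super().__init__()
        self.hidden_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        a = dict(attrs)
        style = (a.get("style") or "").replace(" ", "").lower()
        hidden = (
            "display:none" in style
            or "visibility:hidden" in style
            or "hidden" in a  # bare boolean attribute
        )
        if hidden and a.get("href"):
            self.hidden_links.append(a["href"])

def extract_hidden_links(html):
    """Return the hrefs of hidden <a> tags, in document order."""
    parser = HiddenLinkExtractor()
    parser.feed(html)
    return parser.hidden_links
```

A real crawler would fetch each discovered domain and repeat, building the network graph over time.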

By this time, I also got a journalist with me that started to look at the money flow to try and find the organisation behind it.

My bot found in excess of 100K domains being used for this operation, targeting all of western Europe. All 100K sites contained proxied content and were hidden behind Cloudflare, but thanks to the position I had, I managed to find their backend anyway.

We reported the sites to both CF and Google, and to my knowledge, not a single site was removed before the people behind it took it down.

Oh, and the journalist? He did find a Dutch company that was not happy to see either him or the photographer :)

  • avian 4 years ago

    > We reported the sites to both CF and Google, and to my knowledge, not a single site was removed before the people behind it took it down.

    As someone that tried reporting spam sites because they were using content scraped from my website, I'm not surprised.

    Cloudflare has a policy that they will not stop providing their IP hiding/reverse proxy services to anyone, regardless of complaints. The best they do is forward your complaint to the owner of the website, who is free to ignore it.

    They say "we're not a hosting provider" as if that's an excuse that they can't refuse to offer their service. I'm sure many spam websites would go away if they couldn't hide behind Cloudflare.

    • yosamino 4 years ago

      > The best they do is forward your complaint to the owner of the website, who is free to ignore it.

      Or worse. Since I have no way to know beforehand who I would be dealing with, this is actively dangerous - what if the mobster running this site is having a bad day and chooses to retaliate?

      Also, what a stupid fucking policy that is. Even if you are not legally compelled to block content, what is the point of actively helping distribute harmful content?

      What they are doing is worse than just saying "We are not a hosting provider" - because while that is true, they are actively distributing content that is hosted elsewhere while hiding who is hosting it.

      One can easily write an email to abuse@hoster.example.com, and usually these people do not want garbage on their networks. CF is making it impossible to notify them, and they refuse to implement an alternative procedure.

      I still do not understand the moral position of profiting off of enabling criminal scum, when it would be so easy not to...

      • tlogan 4 years ago

        I do not think that it is up to Google or Cloudflare to police the internet. If a site is doing something illegal, then report it to the appropriate government agency. If the agency does nothing, then get involved in the political process to fix that.

        • hackbinary 4 years ago

          If Google, or CF, or whoever are fronting illegal activity with their services, they are absolutely responsible for damages caused by the party they are proxying.

          Platforms must be responsible for the content they are hosting, broadcasting, and publishing.

          One to one communications between two people exchanging ideas and having a private discussion is different from mass broadcasting.

          • fstrthnscnd 4 years ago

            > Platforms must be responsible for the content they are hosting, broadcasting, and publishing.

            > One to one communications between two people exchanging ideas and having a private discussion is different from mass broadcasting.

            The highway is used both by those visiting their friends and those doing mass deliveries. Is it the job of the highway maintenance crew to control for what purpose their network is used?

            I would like to know if the above analogy stands.

            Edit: https://news.ycombinator.com/item?id=27994831

            • zizee 4 years ago

              The "owner" of the highway is the government, who regulates commercial traffic differently to personal traffic. The government places strict rules on who is allowed to use the highway, and how it is used.

              The highway maintenance crew is akin to the person installing racks for CloudFlare.

            • hackbinary 4 years ago

              Highways are a poor analogy for information broadcast systems in general. Highways are closer to a one-to-one transmission system rather than a broadcast system of one to many.

        • franga2000 4 years ago

          They're already removing things they don't like. I see no reason why they shouldn't remove things that are objectively 100% harmful.

          Like seriously, is there a single person on the planet that's going to defend online scams? It's immoral, it's illegal, it benefits no one and harms thousands. And it's not like it's very hard to detect and block either.

        • xorcist 4 years ago

          If someone were to tell AT&T that this call center customer of theirs is in the business of extorting people for money, they'd at least look at it and help law enforcement accordingly. Cloudflare has a talk-to-the-hand attitude until actively forced by law enforcement. That's an important difference right there.

        • pdimitar 4 years ago

          Gov agencies and political processes take ages to do anything at all.

          At this point I'd still like the internet companies doing partial policing of content. At least they'll achieve something.

      • a2tech 4 years ago

        Because criminal scum pay their bills. You don't think 8chan was on a free account, do you?

        The sooner developers realize that Cloudflare is not saving the Internet, the better.

        • oblio 4 years ago

          At this point I'm convinced that at least 10% of all legitimate economic activity is actually money laundering for crime organizations, in various forms. I imagine that percentage goes even higher in the financial capitals of the world.

      • cratermoon 4 years ago

        > what if the mobster running this site is having a bad day and chooses to retaliate?

        I wonder if someone with malicious intent could set up a site designed to generate complaints (how exactly would be an exercise for the reader), put it behind Cloudflare, and purposely use the information in the forwarded complaints to harass, abuse, dox, or otherwise harm people.

      • FDSGSG 4 years ago

        > One can easily write an email to abuse@hoster.example.com, and usually these people do not want garbage on their networks. CF is making it impossible to notify them, and they refuse to implement an alternative procedure.

        But that's exactly what CF does. They forward your abuse complaints to the abuse contact of the IP address hosting the content.

      • arthur2e5 4 years ago

        The retaliation is quite real—CF keeps your entire e-mail address and name in there, so you are essentially doxxing yourself. Pretty sure 8chan posted a lot of the reports they got back in the day.

    • eru 4 years ago

      They might take that stance, to avoid liability and complication.

      At the moment, they have a very clear rule. If they stop providing services to obvious spammers, they will create lots of grey areas, and they will also implicitly make a judgement that the clients they still serve are _good_ in some way, and an enterprising lawyer or muckraker might exploit that.

      • LudwigNagasena 4 years ago

        Cloudflare dropped the Daily Stormer. The ship of pretense of no judgement has sailed.

        • sneak 4 years ago

          This may have had something to do with the fact that, prior to that, the Daily Stormer was claiming their lack of suspension was an implicit endorsement by Cloudflare of their site and content.

          Misuse of trademarks is a thing.

          I agree, however, that CF's policies are applied arbitrarily.

        • iratewizard 4 years ago

          And 8chan, the 4chan alternative where anyone can make and moderate their own board.

          • hnbad 4 years ago

            Better known for being linked to the Christchurch and El Paso shootings, being the origin of the QAnon movement and having a history of hosting child pornography.

            https://en.wikipedia.org/wiki/8chan

            • iratewizard 4 years ago

              Facebook, reddit, MySpace and Twitter have all been linked to mass shootings and child pornography. None of these sites condone, enable or remotely desire such content.

              • hnbad 4 years ago

                > None of these sites condone, enable or remotely desire such content.

                Yeah, and that's the difference, isn't it? 8kun might not condone any of these things, officially, but it very much enables and desires them.

                This kind of discourse is seen as the "price of freedom", its presence a demonstration of absolute tolerance and blind faith in freedom of speech. Facebook, reddit, MySpace and Twitter are more strictly moderated and impose actual terms of service on their users' freedom of expression.

                But of course this also means the people most motivated to join networks that offer guarantees of free speech absolutism are those whose discourse is not tolerated by these mainstream alternatives. And their presence will almost guarantee an absence of "normies" who don't run into the limits of their freedom of speech on the moderated networks much and feel uncomfortable around the former group.

                Heck, the only reason 8chan ever became large enough to be widely known was because 4chan evicted Gamergate. And 4chan isn't exactly known for its strict moderation and suppression of political views.

                • iratewizard 4 years ago

                  Do you know why 4chan evicted Gamergate?

                  • hnbad 4 years ago

                    Enlighten us, I'm sure your explanation will reframe 8chan in a way that makes it seem a lot more respectable.

                    • iratewizard 4 years ago

                      Moot was trying to be friends with the people in that circle. A girl he was trying to date didn't like it. He was taking awkward baby steps towards his lackluster job at Google, where he would never be promoted or accomplish anything meaningful again.

      • avian 4 years ago

        How is that different from a hosting provider that has to address legal complaints regarding spam, copyright infringement, etc. on their servers? Just like a hosting provider, they specifically have a relationship with the website owner to provide the reverse proxy service. It's not like they can say "we don't know who or how our service is being used".

        It seems to me that if they want to be in this business they have to deal with these liabilities and complications, not hide behind some vague "our hands are tied" language.

        • studentrob 4 years ago

          Presumably if illegal content is not taken down by the customer then the host cancels the service, right? Otherwise the host risks liability. That's different from revealing the IP of a customer which requires a court order.

        • eru 4 years ago

          You have a point, but I assume those businesses' lawyers understand this better than our armchair speculation here.

        • lelanthran 4 years ago

          > How is that different from a hosting provider

          If their argument is "we only retransmit what we get, with caching" then they are in the same place liability-wise as the phone providers ("We only retransmit what we get, with caching").

          In other words, a common carrier.

          Hosting is different. For example, YouTube is not liable for what their users upload. They comply with takedown notices because they, not the user, host the content.

          • breakingcups 4 years ago

            But in a way, they actively host the content. The fact that their server periodically retrieves new content from a different backend makes no difference. The page sits on their hard drives and is served by their servers when I visit that domain. It's always been a very, very thin argument, and it has gotten even thinner with the likes of Cloudflare Pages and Workers.

            Cloudflare is just a huge company actively ignoring abuse complaints and somehow they are getting away with it. It even helps their PR to a certain market segment.

            They even still host kiwifarms, a board that is primarily known for its vicious harassment of people and is known to have driven multiple innocent people to suicide.

            I consider CloudFlare a bad actor at this point and I wish the other big names around them would too. They are subsidizing crime with VC money.

      • IfOnlyYouKnew 4 years ago

        This logic doesn’t make sense. Nobody is under the illusion that CF is somehow incapable of denying service to individual customers.

    • mattbee 4 years ago

      This policy even extends to stresser/booter/DoS-for-hire services - try searching for some and see who fronts them.

      20 years ago the transit providers of the internet would have spotted Cloudflare for what it is, and cut it off.

    • guest159835 4 years ago

      I've been reporting hundreds of spam sites to Cloudflare, but always get the same lame excuse. GoDaddy is the same. Meanwhile good content drops in Google rankings and spam moves to the top.

    • FeepingCreature 4 years ago

      That seems like the sort of thing that should require a judge's order.

      • trangus_1985 4 years ago

        Cloudflare is not a public institution. It troubles me that they get to define, draw, and then maintain that line.

        However, I do agree - privacy unveiling like that should require a judge's order.

        • stavros 4 years ago

          But they don't, that's explicitly their stance. There is no line. They host everyone equally. To do the opposite would require drawing a line.

          • yosamino 4 years ago

            That is not true. They do have a line, specified here: https://www.cloudflare.com/abuse/

            It's just that the procedure is so useless that it might as well not exist.

            • fauigerzigerk 4 years ago

              IANAL, but I don't see a Cloudflare specified line anywhere on this page. I think this is just the bare minimum they are legally required to do.

          • trangus_1985 4 years ago

            > There is no line

            You are missing the point of the complaint, which is that it's a private decision to hold that policy. Maybe it was a bad idea to use the word "line", but the intent still stands unaddressed.

          • account42 4 years ago

            > They host everyone equally.

            Everyone except those that are too right wing.

            • stavros 4 years ago

              Are you referring to the one incident where they stopped hosting a racist hate site and then vowed to never take sides again?

              • account42 4 years ago

                Yes. Also to the incident where they stopped hosting 8chan after they vowed to never take sides again.

                You can agree with Cloudflare not providing services to those sites as much as you want, but you cannot pretend that Cloudflare hosts everyone equally. They cannot use that as an excuse to not deal with spammers.

              • LudwigNagasena 4 years ago

                Well, that one incident shows that they don’t host everyone equally. A very simple and obvious conclusion.

              • nextlevelwizard 4 years ago

                "He has never murdered anyone" "Are you referring to the one incident where he shot a racist hater and then vowed to never murder again?"

                All I'm saying is that we won't know until they come under pressure again

                • stavros 4 years ago

                  > All I'm saying is that we won't know until they come under pressure again

                  That's also true of people who haven't murdered anyone yet, though.

                  Whom do you trust more? The person who did something and vowed to never do it again, or the person who didn't vow anything? I tend to prefer the former.

                  • wccrawford 4 years ago

                    When it comes to murdering someone, I'm going to prefer the person who has never murdered anyone yet.

                    When it comes to service providers, I would tend towards your direction. They did a thing that had conflicting ethics on each side, weighed the outcome and their ethics, and then made a hard decision for the future. What they did could be reversed, too, and didn't cause much permanent damage.

                    Murdering someone is very permanent and should take a lot more initial consideration.

                    • lupire 4 years ago

                      This metaphor is absurd. The actual murderers here are the contributors to the banned sites, not Cloudflare, and there were a lot more than one murder.

        • studentrob 4 years ago

          Yes, it is akin to revealing the IP of a user on a social media site.

    • andyjohnson0 4 years ago

      I'm pretty sure they stopped providing services to a neo-nazi site a few years ago. A decision that I am completely happy with btw.

    • nine_k 4 years ago

      This is very rational of them. They position themselves as a pipe for "bytes", not "content".

      By ignoring the content they serve, they rid themselves of the necessity to analyze and judge what they serve. Not only would this require a brain the size of a planet and the expense of running it, but also would inevitably conflict with someone else's judgments, and bring various PR woes.

      They don't analyze the internals of their traffic the way internet backbone providers don't analyze the internals of the traffic they pass around.

      I frankly find this position superior: imho it does more good by preventing censorship than harm by serving good-intentioned and bad-intentioned customers alike.

    • rowanG077 4 years ago

      In fact I completely agree with that stance. It's not Cloudflare's job to police the content. They provide a simple service. If something is unlawful, law enforcement should go after the owners.

      • withinboredom 4 years ago

        And how, might I ask, do you propose to do that?

        • rowanG077 4 years ago

          Law enforcement can get a warrant to get information to try and find them for example. They can hire security experts. They can do tons of things.

  • dylan604 4 years ago

    That sounds like a hell of an investigation, and now my curiosity is running. 100K domains sounds like a huge amount of logistics on their side to keep it all running. It would be interesting to read about how a spam company manages that kind of infrastructure compared to a "legit" company.

    A legit company will always have internal struggles between dev/sales/marketing, so things just take longer and are much more draining to accomplish. I'd imagine a spam org just needs to have the bare minimum up to satisfy whatever need they have, knowing that humans won't necessarily be perusing those domains - yet it's 100K domains. I could almost see something like this running more smoothly. I can also see it being run by a small number of people who let things lapse, and it's just barely hanging together. So many questions...

    • tluyben2 4 years ago

      It is not very difficult to manage: a company of mine was bought by a squatter (I found out after dealing with a broker for the sale; I had to integrate it with their 'tech team' and walked away after), and for many years already, all of this has been fairly easy to automate. The registrars have APIs, Cloudflare has APIs. There was one tech guy keeping it all up and running, and he didn't have to do anything. It would register and provision with content automatically. There is really almost no work involved besides keeping money in the registrar account, and the costs are probably only the domains - maybe they have a little Hetzner load-balanced setup with two machines, but that's likely it.
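A rough sketch of how little glue code the register-and-provision loop described above needs. Every function body below is a hypothetical stand-in for a registrar or Cloudflare API call; none of these names, signatures, or endpoints come from the comment:

```python
import secrets

# Hypothetical stand-ins for a registrar API and the Cloudflare API.
# Real code would make authenticated HTTP calls here.
def register_domain(name):
    pass

def put_behind_cdn(name):
    pass

def deploy_proxied_content(name, origin):
    pass

def provision_batch(origins, tld=".com", count=10):
    """One-shot setup per throwaway domain: invent a name, register it,
    front it with the CDN, and point it at a scraped-content origin."""
    domains = []
    for i in range(count):
        name = secrets.token_hex(4) + tld  # e.g. '9f2ab01c.com'
        register_domain(name)
        put_behind_cdn(name)
        deploy_proxied_content(name, origins[i % len(origins)])
        domains.append(name)
    return domains
```

Run on a timer, this is the "one tech guy who didn't have to do anything" operation: the only ongoing input is money in the registrar account.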

  • tikiman163 4 years ago

    The reason you found so many domains is that they intentionally take down their spam sites and reload them under a new domain every few hours. They do this so they can't be taken down by people reporting them as spam. They literally set up the next domain while the current one starts being used, so they can do a live swap to the next one without interruptions to their spam operations. This is typically done in an effort to spread Trojan malware to anybody running computers with out-of-date operating systems and browsers. Windows getting people off of Internet Explorer has dealt them a huge blow, as it reduces the number of possible vulnerabilities someone might have when they get sent to one of these Trojan spam sites.

  • ultimoo 4 years ago

    > By this time, I also got a journalist with me that started to look at the money flow to try and find the organisation behind it.

    Very curious to know what you found!

  • lifeisstillgood 4 years ago

    Can I just clarify?

    There is/are organisations that a) scrape legitimate sites for content, b) host that content on their own 100K domains, c) sit behind Cloudflare, d) do some SEO??? e) when someone finds their site, inject an ad or similar rubbish, and f) do this enough that they make money off the ads / competitions / porn?

    That seems like a problem that the "original-source" metatag was supposed to stop?

    • tyingq 4 years ago

      Canonical URLs help with marking your own purposeful duplicated content. But that meta tag goes on the duplicated content, so it doesn't help with scrapers, who strip it out.
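As a side note, the canonical declaration being discussed is just a link element in the page head, which is exactly why a scraper can trivially drop it. A minimal stdlib-Python checker, purely illustrative:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Record the href of the first <link rel="canonical"> tag seen."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if (
            tag == "link"
            and (a.get("rel") or "").lower() == "canonical"
            and self.canonical is None
        ):
            self.canonical = a.get("href")

def find_canonical(html):
    """Return the declared canonical URL, or None if the tag is absent
    (as it will be on a scraped copy that stripped it)."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical
```

The original page declares its own URL; the scraped copy simply returns None, and the search engine is left to guess who published first.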

      • lifeisstillgood 4 years ago

        But I thought that it was useful for Google - who could find two caches with the same content, one from 2018 and one from 2020, both saying "this is canonical". At that point the 2018 version is taken as the real one and the other rejected.

        Then again, you could just do it with publication dates ...

        • tyingq 4 years ago

          I don't know why, but Google seems unable to figure out (or just doesn't care) "who published it first". I've seen it be confused many times.

  • pepy 4 years ago

    Do you want to get to the bottom of this? A friend of mine is a top Dutch lawyer with an interest in these things.

keyme 4 years ago

Google search has progressively deteriorated in quality over the last 10 years, to the point where I see it becoming useless in the relatively near future. And it's mainly not even their fault.

I've been using Google search for all kinds of research for 15 years. There used to be a time when you could find the answer to pretty much anything. I could find leaked source code on public FTP servers, links to pirated software and keygens, and detailed instructions for a variety of useful things. That was the golden age of the web.

These days, all the "interesting" data on the Internet is inside closed Telegram chats, Facebook groups, Discords, or the rare public website here and there that Google doesn't want to index (like Sci-Hub, or other piracy sites).

The data that remains on SERPs is now also heavily censored for arbitrary reasons. "For your health", "For your protection". Google search is done.

  • omega3 4 years ago

    > And it's mainly not even their fault.

    It's precisely their fault: they've created an environment that incentivizes low-quality, irrelevant content, and they are actively hostile towards users. Two examples just from the top of my head: ignoring the country website - previously, if you wanted to search only local news, it was very easy to do. Another was completely ignoring exact-phrase search with double quotes.

    • spaniard89277 4 years ago

      Ignoring double quotes drives me crazy. That's the last straw that sent me to DDG, although I have to say that DDG isn't much better either.

      What made me really angry about Google Search was when they removed their function to search in discussion forums. But even then you could more or less filter out crap.

      Nowadays it feels very hard. I find myself using the site: flag many times, but you need to know the site beforehand, which is another problem.

      • BitwiseFool 4 years ago

        I also feel like some product manager decided that having a blank results page is horrible. So even if I put terms in quotes, and there are no results with those quoted terms, Google decides to show me results that have virtually nothing to do with what I want to see.

        • throwuxiytayq 4 years ago

          Except a blank page is exactly what I want to see if there are no results or if I mistyped my query. These shoehorned-in results throw me off every time because it takes extra mental effort to reinterpret them as "oh, google has no results for what I typed in past this point, so they're showing me random crap". I miss the old days when search was as precise as a scalpel.

          Maybe I'm naive about the complexity of the problem (every article I read about the difficulty of what Google's doing certainly suggests so), but I honestly believe that we've reached the point where a talented and well-funded startup could outplay Google at their own game.

          • BitwiseFool 4 years ago

            >"blank page is exactly what I want to see"

            I literally couldn't agree more. I can't stand how bad searching has become.

            While we're at it, you know what else I really hate? How google switches the order of the buttons for Images, News, Shopping, Video, etc. on EACH QUERY. Who in the world ever thought this was a good idea?

            • throwuxiytayq 4 years ago

              I've always assumed that this is a bug in their A/B tests, because I cannot even imagine how utterly degenerate their product design process must have become to come up with this on purpose.

        • aasasd 4 years ago

          Exactly, it's very easy to see how Google doesn't leave the user unspammed. YouTube's search works the same way: even if there are useful results on top, they quickly trail off into clickbait garbage. Plus the unrelated 'people also watch' lists, injected every few items. The search filters are barely enough to dial in when you want to skip obvious trash, but they give up on anything slightly complicated. On the Play Store it's worse: you just get troves of what Google thinks you should be getting, with no control on your side - because if people could skip apps with payments inside, they would, and who in Google wants that?

          • BitwiseFool 4 years ago

            I feel like YouTube's goal is to always get you to watch something else. Scroll down in search results? See unrelated videos. As soon as the video starts? See a 'Recommended' badge. Pause the video? See an overlay with other videos. Leave the video running? Autoplay fixates on something else.

      • GuB-42 4 years ago

        I think there is a market here. "Dumb" search engines that search exactly the words you type, maybe with advanced features like regex, metadata search, etc. It won't replace Google's guesswork, but sometimes I just want to grep the internet.

        Non-Google engines are all about privacy, which is nice, and almost a requirement if you want to compete with Google, but I'd like to see features that actually improve search too. DDG gets an honorable mention with its bangs and applets.
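The "grep the internet" idea above can be sketched as a tiny exact-match inverted index. A toy, in-memory illustration (real engines shard and compress this, but the contract is the point: no stemming, no synonyms, no "did you mean"):

```python
from collections import defaultdict

def build_index(pages):
    """pages: {url: text}. Map each lowercased whitespace-separated
    token to the set of URLs whose text contains it."""
    index = defaultdict(set)
    for url, text in pages.items():
        for token in text.lower().split():
            index[token].add(url)
    return index

def exact_search(index, pages, phrase):
    """Exact-phrase search: intersect the posting lists of every token,
    then verify the literal phrase with a plain substring check."""
    tokens = phrase.lower().split()
    if not tokens:
        return []
    candidates = set.intersection(*(index.get(t, set()) for t in tokens))
    return sorted(url for url in candidates if phrase.lower() in pages[url].lower())
```

A query either matches literally or returns nothing - which is exactly the blank-results-page behaviour people in this thread are asking for.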

        • BitwiseFool 4 years ago

          I'd like to go a step further and hope for "dumb" search engines that are tailored towards indexing specific subsets of the internet as a whole. As an example, imagine a search engine that is specifically tailored towards programming questions. Or one that specifically omits some of the more annoying SEO optimized results, like Livestrong and USA Today.

          • zomglings 4 years ago

            My current product started off as precisely that kind of search engine.

            User adoption is a huge problem - almost no users made it their default search engine because even programmers need to do non-programming searches and it's too easy to go into your browser, hit ALT+d, bang out your search query, and hit enter.

            And because Google and DDG do a good job on most programming-related searches, they get to be the default search engines.

        • lubesGordi 4 years ago

          I think so too. It'd really be nice to get a search capability that doesn't take my past searching into account. I want an unbiased search, with good tools to filter.

        • WarOnPrivacy 4 years ago

          > DDG gets a honorable mention with its bangs and applets.

          Yeah, but it ignores most other operators. It's fairly frustrating to be unable to mandate a search term.

        • quijoteuniv 4 years ago

          "I just want to grep the internet" +1

      • soco 4 years ago

        Yeah, I also try DDG first, and only if that's not okay do I go !g. Now I wonder, why did Google break such a useful thing? They could have shown advertisements in their "classic" search too (let's call it that), so I'm really at a loss - what was in it for them to change? Germans have a word for that, "verschlimmbessern" (or even two words - "kaputtreparieren"), which means breaking something by trying to make it better.

        • input_sh 4 years ago

          > Now I wonder, why did GOO break such a useful thing?

          As far as I understand it, they want to catch synonyms and different tenses for the words.

          But they do a remarkably shit job. nginx and apache2 aren't synonyms, but completely different tools for the same job. Yet apache2 instructions appeared as a match when I used "nginx" in my query (the word apache2 was in bold in the result snippet).

        • spaniard89277 4 years ago

          I guess it has something to do with Google Internal dynamics. But I'm not the one being paid big bucks to think about such stuff. Maybe they've done their due diligence and made their tradeoffs, but I'm clearly not the target of the search engine anymore.

          I'm only a dumb nobody, so if this is a problem for me, I wonder how it is for all the smart people that hang out here at HN.

          It just feels very uphill to use Google right now. No matter how many flags or tricks.

        • Snarwin 4 years ago

          The median Google search user probably never learned to use any of these "advanced" features in the first place. For them, having Google ignore the precise wording of their query and show results for more common related terms is almost certainly an improvement.

    • remus 4 years ago

      > It's precisely their fault: they've created an environment that incentivizes low-quality, irrelevant content, and they are actively hostile towards users.

      I think this is an overly harsh take. I strongly suspect that any algorithm for ranking search results is open to gaming and manipulation by malicious users.

      • account42 4 years ago

        Google changed SEO from a seedy practice to something they actively encourage, promote and support.

        Google stopped shitcanning sites that that present different things to Googlebot and regular users, including sites that require a login for normal users but show content to Googlebot.

        Google imposed arbitrary ranking criteria that favor long-winded blogspam over concise articles that immediately tell you what you want to know.

        Yes, this is their making.

    • account42 4 years ago

      > Ignoring the country website, previously if you wanted to search only local news it was very easy to do

      Also the opposite: insisting on pushing local and localized results on google.com even when I set my browser language to english.

      • DoingIsLearning 4 years ago

        They used to have google.com/ncr

        'ncr' stands for no country redirect, and it did what it said on the tin.

        Of course, like all useful power-user features, it got deprecated in favour of the natural language query nonsense we have today.

        • usr1106 4 years ago

          Nowadays you need to VPN to the target country. For a reason too complicated to explain here, I searched for local businesses in the city of Melun, France. There were no reasonable hits. Well, my IP was Finnish (to my best knowledge they have no other means of localizing me), and "melu" means noise in Finnish, with "melun" being a common form of it. No addition of French shopping terms could convince Google that I was not interested in noise abatement. The Accept-Language header did not help. After switching to a French IP it worked like a charm. And one would guess searching for shopping and businesses would be Google's strength.

    • yreg 4 years ago

      How do any of those make people talk inside closed Discord groups instead of the open web?

    • BizarroLand 4 years ago

      I'm sure public human SEO manipulation is at least partly to blame. The only thing that is surprising is that it isn't worse than it is. At least the first half page is usually close to what you want.

  • Adrig 4 years ago

    One of the last use cases for Google is being a proper search engine for Reddit. But I think they are aware of their downfall, that's why the top of the page is increasingly taken by their widgets to provide directly the information.

    On the other hand, Youtube is the second most popular search engine and I don't see it slowing down. What an insight they had when they bought it.

    Edit: I entirely agree that valuable information is found more in communities nowadays. I also predict that in 5 years the web will be mostly explored through communities.

    • the_duke 4 years ago

      > that's why the top of the page is increasingly taken by their widgets to provide directly the information.

      Another reason for that is user retention.

      If you get your information directly on google.com, you won't navigate away, probably search again, and bring in more ad revenue.

    • wil421 4 years ago

      When I’m looking for reviews of a product I usually type XXX review Reddit to avoid the XXX top 10 list blog spam that google returns. I don’t want a review from someone who just jumbled together a top 10 list without ever looking at the product in person.

    • tonypace 4 years ago

      YouTube search is regressing quickly. They're losing there too.

  • nuker 4 years ago

    > Google search has progressively deteriorated in quality

    49 out of 50 review sites are now just affiliate links to Amazon. "Check the price on Amazon" buttons are the main content there.

    • wccrawford 4 years ago

      I've noticed this a lot lately. There are words on the page that look like a description of the product and a review, but once you really read them you see that they could have been generated by a bot: they don't actually review the product, just describe its basic properties. Then they provide that button.

      • topicseed 4 years ago

        True, although Google has known about this issue and released guidelines alongside a "Product Review" algorithm update three or four months ago.

        Let's see if things improve in the near future.

      • nuker 4 years ago

        And many are on the first page of Google search, the "best search engine", lol. We get what we pay for. Where are the paid web search startups?

  • mojzu 4 years ago

    I think it depends on what you're searching for; for dev related stuff no other search engine I've tried comes close. But there are whole industries now that are so heavily SEO'd that finding useful information without knowing the exact keyword to search for is incredibly frustrating.

    • kall 4 years ago

      I agree, and I've read the opinion too that it's a problem people have with DDG. Yet Google doesn't feel excellent at that either. Could it be worth competing with Google there? I'm not gonna say it's "easy", but maybe worthwhile and possible?

      I don't think I have used more than 1000 different sites in all my development searches ever. It's the Stack Exchange network, GitHub, official documentation, non-GitHub official issue trackers/communities and some high quality blogs. That seems very manageable. You could probably index all of that into one Elasticsearch and one Sourcegraph instance. Add a little more specific faceted search, add back powerful and precise query syntax, and still maintain "just paste in whatever and hit the first result" functionality. I'm likely underestimating the breadth of other developers' needs compared to my own. I don't know.
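A toy sketch of that curated-index idea: index only documents from a small allowlist of trusted developer sites and rank by simple term overlap. The site names and ranking scheme are illustrative assumptions; a real build would use Elasticsearch/Sourcegraph as suggested above.

```python
from collections import defaultdict

# Hypothetical allowlist of trusted developer sites.
ALLOWED_SITES = {"stackoverflow.com", "github.com", "docs.python.org"}

class TinyDevIndex:
    def __init__(self):
        self.postings = defaultdict(set)  # term -> set of doc ids
        self.docs = {}                    # doc id -> (site, text)

    def add(self, doc_id, site, text):
        # Only index pages from the curated allowlist; spam farms never enter.
        if site not in ALLOWED_SITES:
            return
        self.docs[doc_id] = (site, text)
        for term in text.lower().split():
            self.postings[term].add(doc_id)

    def search(self, query):
        # Rank documents by how many query terms they contain.
        scores = defaultdict(int)
        for term in query.lower().split():
            for doc_id in self.postings.get(term, ()):
                scores[doc_id] += 1
        return sorted(scores, key=scores.get, reverse=True)
```

The point of the sketch is that the hard SEO-spam problem disappears when the corpus is an allowlist rather than the open web.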

      • mojzu 4 years ago

        I think a tool like that could be very valuable, as you said in most cases you end up in the same few common locations. Most of the time the reason I fall back on google is because I'm not sure whether what I'm looking for is going to be in a github issue, in a bug tracker, in a forum, in a stackoverflow answer, in a mailing list, etc.

        There was a docs aggregation site I tried at one point that was quite useful, but without search across issues/forums etc. I didn't end up sticking with it

  • jacobolus 4 years ago

    Google scholar search is still very useful.

    DuckDuckGo is nowadays more useful than Google for my web searches.

  • smusamashah 4 years ago

    You should try yandex.ru for all that interesting stuff. They don't censor any of it.

  • kkoncevicius 4 years ago

    Google seems to also place less emphasis on search phrases. When searching for exact article names I easily find them on DuckDuckGo, but not on Google. Two recent search-term examples:

    1. the scientific worldview needs an update

    2. from reproducibility to over reproducibility

    • account42 4 years ago

      In general, Google no longer primarily searches for what you asked but for what it thinks you want. This might be better for the average user, but can be extremely frustrating when you are trying to find something more niche.

      • BitwiseFool 4 years ago

        I think it is also the result of the whole "Ok, google" voice assistant push. It seems like Google switched to natural language processing and the old-school system of keyword searching is no longer effective.

      • GekkePrutser 4 years ago

        Yeah these algorithms are so stupid. They always assume you want more of what you've seen before. For me it's usually the complete opposite.

    • yetanotheralexn 4 years ago

      Your examples seem to work for me (the second one only if combined with double quotes). Do you have more? https://snipboard.io/PYhNHW.jpg https://snipboard.io/HvRaiE.jpg

      It would be cool to find datapoints for a proper bug report for Google :)

      • kkoncevicius 4 years ago

        Well, there is always stuff Google thinks you shouldn't read about. Try these (both are first hit on ddg.gg and nowhere on google):

        - Politics Influences the Science of COVID-19

        - Ten elements of false covid narrative

        - Josh Mitteldorf unthinkable thoughts

  • cratermoon 4 years ago

    Whether or not it's Google's fault depends on how much you attribute the development of the advertising-driven distraction-factory internet to Google's business. We can debate whether or not Google was ever really in the search engine business; certainly at one point the search was a useful tool. Today, Google search is a sort of glorified Yellow Pages*. Their main product is selling ads in this Nouveau YP. The results their search engine returns are now heavily skewed towards revenue-generating sites. Such sites may incidentally be informative, but they are generally selling something.

    Edit: see this other HN story: https://news.ycombinator.com/item?id=27993564

    This is not to say that all search results are bought, although of course those are present now, too. But overall Google presumes that whatever the user is searching for, the best result is one where the answer is "buy this thing".

    For those search results that don't lead directly to commercial products, the revenue generation is indirect: through the collection of user preferences and activity, Google can refine its search results towards maximizing revenue. At the very least, the result is likely to be a site that has ads, some of which generate revenue directly for Google.

    *In the old-fashioned Yellow Pages book, you couldn't really "search," but there was an index by category. It had many of the issues inherent in categories, but it didn't take an expert to find things. Google search eliminates the need for anyone to understand a taxonomy of businesses.

  • herbst 4 years ago

    Google only recently started to totally butcher the Swiss search results. For some reason I could still find direct download links to movies and music a few years ago (kinda legal here).

    Now such search results often don't even get a second page...

  • IfOnlyYouKnew 4 years ago

    If 90% of what you’re searching for is keygens and „inside closed Telegram groups“, it might just be time to grow up?

janmo 4 years ago

I've seen the same here in Germany, but the pages only appear if you use the "results from the last 24 hours" feature. It looks like the German content is generated with GPT-2 or 3; it makes no real sense if you read it. If you go to the page you are immediately redirected to a scam, just like the article mentions. Interestingly, they use ".it" domains here. It also looks like the domains might have been hacked, or are expired domains that have been bought.

For example if you check havfruen4220.dk on archive.org you can see that it appears to have been a legitimate business website before. https://web.archive.org/web/20181126203158/https://havfruen4...

How do they rank so well?

I checked the domain on Ahrefs and it has almost no backlinks. But if you look closely, you will see that all the results that rank very well were added very recently. On the screenshots in the article you can see things like "for 2 timer siden", which means "2 hours ago". It looks like Google is ranking pages with a very recent publishing date higher.

Edit: Here is what the content of such a site looks like: https://webcache.googleusercontent.com/search?q=cache:Bk0VsM...

  • adventured 4 years ago

    Typically Google has a warming/trial period for new large content sites, after their search bot is introduced to the content and has spidered its way through the site.

    For example, there used to be a very common content farm system that was structured like this:

    https://domainsites.com/site/nytimes.com

    So when people searched for sites by domain name, the zillions of low traffic long-tail results of this farm system would be all over Google's results.

    What it would present on the page is a mess of data about nytimes.com, such as traffic, or keywords pulled from the site header, maybe a manufactured description (or pulled right from the site head), sometimes images / screenshots of the site. Anything that could be stuffed in there to fill up enough content to get Google to not do an automatic shallow content kill penalty on the content farm. This worked for several years very successfully until Google's big algorithm updates, 9-10 years ago or whatever now (Penguin et al.). You could just build a large index of the top million domains (eg Alexa and Quantcast used to provide that index in a zip file), spider & scrape info from the domains, and build a content farm index out of it and have a million pages of content to then hand off to Googlebot.

    So initially such a farm will boom into the search rankings, Google would give them a trial period and let out the flood gates of traffic to the site. Then Google would promptly kill off the content farm after the free run period expired and they had figured out it was a garbage site.

    I still occasionally see this model of content farm burst up into traffic rankings, and it's usually very short lived. It makes me wonder if that's not more or less what's going on with the Mermaid farm.

  • kostecki 4 years ago

    This definitely looks like an expired domain that was bought. Havfruen seems to be a restaurant in the city of Korsør, which conveniently has the postal code 4220.

  • NorwegianDude 4 years ago

    .it pages are used in Norway too, but I'm not sure it's something GPT-ish that's being used. Whole sentences are copied word for word from other articles. (Might it be a small dataset it's trained on?)

    It could of course be something similar to GPT, trained on all the content it could find, that then writes articles, because it's clearly messing up sometimes, judging from the small piece of content visible on the search results page.

    I'm not sure if this is an ML race, and the reason we're not seeing the same thing in English is that Google understands English better than the spammers do, while in Norwegian and German it's the other way around?

    Clearly freshness is a large part of it. Google seems to have indexed millions upon millions of pages tied to this in the last 24 hours.

  • ROARosen 4 years ago

    Seems like this is not a new thing. Here is a warning tweet from the beginning of July from Danish cybersec guy @peterkruse, who saw his name coming up for a different domain owned by the same registrant as havfruen4220.dk:

    https://twitter.com/peterkruse/status/1410895961803665410

  • nmstoker 4 years ago

    I presume "GPL" was an autocorrect from the intended "GPT" right?

    • janmo 4 years ago

      Correct, it was a typo

      • dylan604 4 years ago

        I don't know. I've tried reading the GPL2 & 3, and a lot of it just sounds like lawyer gibberish to me that could easily be attributed to GPT

  • MrUnderhill 4 years ago

    Interesting, I've been seeing the same spam for Norwegian searches, but with the domain nem-multiservice dot dk, or nem-varmepumper dot dk, presumably another legitimate business' domain that expired and was grabbed by the scammers. Visiting those domains shows the same graphic as shown in the article.

    Almost any search in Norwegian will have obvious scam sites like these in the top 10 results.

    Other domains part of the same scam that show up in my results today: mariesofie dot dk, bvosvejsogmontage dot dk

    I wonder if it is related to this: https://www.dk-hostmaster.dk/en/news/dk-hostmaster-takes-102...

    • NorwegianDude 4 years ago

      Yup. Those domains are the same thing, and redirect to the same place. There are even more domains.

      Never seen anything on this scale before. I can search for basically anything (tax rules, baking, stocks, property, hygiene...) and Google will most likely show those domains somewhere.

  • e_carra 4 years ago

    I had similar experiences with: https://www.xspdf.com/resolution/51859292.html

    The content seems taken from other websites and mixed in a nonsensical way. It comes up frequently in my search results. www.xspdf.com has completely unrelated content and seems a separate business.

ricardo81 4 years ago

Poor man's cloaking

curl -A 'Mozilla/5.0 (X11; Linux x86_64; rv:78.0) Gecko/20100101 Firefox/78.0' 'https://havfruen4220.dk' > 1.html

curl -A 'Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)' 'https://havfruen4220.dk' > 2.html

diff 1.html 2.html 7d6 < <script>var b="https://havfruen4220.dk/3_5_no_14_-__1627553323/gotodate"; ( /google|yahoo|facebook|vk|mail|alpha|yandex|search|msn|DuckDuckGo|Boardreader|Ask|SlideShare|YouTube|Vimeo|Baidu|AOL|Excite/.test(document.referrer) && location.href.indexOf(".") != -1 ) && (top.location.href = b); </script>

  • gvb 4 years ago

    The "diff" output (above) needs an extra line break to avoid HN automatic line wrapping. The output of the diff command is:

    diff 1.html 2.html

    7d6
    < <script>var b="https://havfruen4220.dk/3_5_no_14_-__1627553323/gotodate"; ( /google|yahoo|facebook|vk|mail|alpha|yandex|search|msn|DuckDuckGo|Boardreader|Ask|SlideShare|YouTube|Vimeo|Baidu|AOL|Excite/.test(document.referrer) && location.href.indexOf(".") != -1 ) && (top.location.href = b); </script>
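The injected script only fires when document.referrer matches a search engine, so direct visits look clean. As a rough illustration (not a general cloaking detector; the patterns are simply lifted from the snippet above), a fetched page could be flagged like this:

```python
import re

# Patterns assumed from the injected snippet above: a script that tests
# document.referrer against search-engine names and then rewrites
# top.location.href to send search visitors on to the scam page.
REFERRER_TEST = re.compile(r"test\(\s*document\.referrer\s*\)")
REDIRECT = re.compile(r"top\.location\.href\s*=")

def looks_referrer_cloaked(html: str) -> bool:
    """Flag pages containing a referrer-conditional redirect."""
    return bool(REFERRER_TEST.search(html) and REDIRECT.search(html))
```

Combined with the two-User-Agent curl trick above, this makes it cheap to sweep a list of suspect domains.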

NorwegianDude 4 years ago

I've noticed this daily.

Would be interesting to see the actual content. Based on the small snippets in the search results, it takes content from other sites, like large Norwegian news sites, and somehow outranks them hard.

I wonder what the Google Search Console looks like for that domain, considering that it's probably getting millions worth of free traffic.

EDIT: After looking more at it, it's insane how much it ranks for, and how well. Straight-up brand names seem to be the hardest for it to compete with, at least the larger ones. Those seem to be around page 4-5 for me.

Some brands I was unable to find at all, but ironically another .dk domain showed up in their place doing the same thing. There are also some .it domains using the same content.

I've found that it takes content from multiple sources and glues it together, sometimes in surprisingly good ways: one sentence from this page, another thing from that page.

Maybe this is some ML that collects content and pieces sentences or half-sentences from it together into one large article? It's clearly from completely different sources, but about the same topic.

Example: "wash car"

Result in google: "A dark winter with snow and salt is hard on the car, and it's extra important to wash the car" - Collected from one article.

<some other text>

"Keep the pressure washer at 30-50 cm from the car..." - From another article.

Ironically, there are something like 11 results all tied to this thing outranking the original articles (those come last), even when the originals are from medium to large, well-known companies each selling for billion(s) of dollars a year in Norway.

Sometimes it goes from one thing and switches to something completely unrelated, so I guess the spammers still have something to improve.

Weird.

weird-eye-issue 4 years ago

Some data on their traffic from some SEO tools I pay for:

Ahrefs: 230k organic traffic valued at $124k

SEMRush: 558k organic traffic valued at $355k

These are estimates and can be wildly under- or overestimated, but they show that this is happening on a very large scale.

For a quick idea of how this is possible, I looked at their top pages (according to Ahrefs). Their top page is ranking #2 for the keyword "interia", which has 207k searches per month in Norway and is rated 0 (out of 100) for ranking difficulty. Usually a keyword with that amount of searches would be incredibly hard to rank for; I've never seen anything like this. So what is happening here looks like they are just taking advantage of a market with really low competition keywords.

  • NorwegianDude 4 years ago

    Interia is a large Polish web portal, from what I could find. Norwegian people don't know it, but Polish people might; roughly 2% of the people in Norway are Polish. It also ranks as #1 for me. It's in Polish too, so basically only that ~2% of Norway would understand it.

    However, the weird thing is that it steals content from articles, and then outranks them. Most pages seem to be boosted, maybe as a result of being new. (Most content is just hours old.)

    Could you check these too? (exactly the same thing, but newer, it seems) www.mariesofie.dk nem-varmepumper.dk

    Clearly reused domains.

    • weird-eye-issue 4 years ago

      The keyword data was based on searches in Norway alone, it is an order of magnitude higher in Poland. In Norway almost anybody could rank for that keyword if they tried due to the difficulty being different based on location and language.

  • Ueland 4 years ago

    Sidenote, but what do you think about Ahrefs? I'm doing some tests to see how easy it is to get ranked for keywords (with actual helpful content, not crap like this thread is about), but I find the AdSense keyword tool not that helpful, as it deletes many keywords when you search for them, which kinda voids that tool.

    But I currently feel that paying $100/mo for Ahrefs for something I do as a side project is a tad wasteful.

    • weird-eye-issue 4 years ago

      You need a tool like Ahrefs or SEMRush for competitor analysis and keyword research. One trick with Ahrefs if you want to be frugal is to pay for the $7 trial and use it as much as possible during the trial to do your keyword research and cancel. Technically if you are efficient enough that trial could get you months worth of content at least.

    • 55555 4 years ago

      ahrefs is the best in the business.

gnyman 4 years ago

Pet theory (disclaimer: I know very little about SEO): the website with the cloned content loads fast and does not load 4 MiB of JavaScript, thus beating the original content in ranking mostly because of its speed, which I believe is an important factor in Google rankings (and getting more important).

And add to that some link spam, plus preventing visitors from returning to the results page so there is no bounce-back signal...

Either way, I can't help but be a bit impressed by the SEO spammers outsmarting the people at Google. (Edit: I don't mean to say they are smarter or anything, just that they only need to find one weakness in the algorithm, while the people working to improve it need to make it work for everything.)

  • jmiserez 4 years ago

    Once the hard requirement on speed impacts the quality of results, it no longer helps me as a user. I'd rather have sites invest their time in good content and wait a few seconds than get fast but low quality SEO'd results. Same with AMP: the quest for speed doesn't make my experience faster if I still have to load the original page (which is often still necessary).

monday_ 4 years ago

Not sure how relevant this is, but the animal characters in the top image are from the hit Russian children's cartoon "Smeshariki" (literally "the laugh-balls").

Schnurpel 4 years ago

If I were running a global infrastructure company like Cloudflare, I also would not take any sides, and would leave my service open to anyone. The world is full of people who get upset about something. However, if I declare a hands-off policy, it must be truly hands-off. Cloudflare kicked off Switter https://www.theverge.com/2018/4/19/17256370/switter-cloudfla..., it banned 8chan https://blog.cloudflare.com/terminating-service-for-8chan/ , and it banned The Hacker News https://mobile.twitter.com/thehackersnews/status/66900183605... . That's not how hands-off works.

  • notRobot 4 years ago

    To be clear, that's not HN, but The Hacker News, a different website, known for... dubious reporting.

bigpeopleareold 4 years ago

I hate dealing with this and now refuse to use Google, after seeing these patterns in search results while researching common things (like housing) in Norwegian, here in Norway. I rarely use Google these days, but I thought for a second that Google might give better Norwegian results than DDG; this stuff is aggravating. It's one of those cases where they screw around with your history so much that you just have to start fresh on whatever you were doing instead of going back.

edit: one other thing I have seen, but it doesn't mean it is always spam. All The Words In A Title Are Capitalized - it's something to pay attention to whether it is spam or not. Conventionally, titles are usually not like that in Norwegian.
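A toy sketch of that capitalization heuristic (the 0.9 threshold and the example titles are my own assumptions, since Norwegian headlines normally capitalize only the first word):

```python
def capitalized_word_ratio(title: str) -> float:
    """Share of words in a title that start with an uppercase letter."""
    words = [w for w in title.split() if w[:1].isalpha()]
    if not words:
        return 0.0
    return sum(w[0].isupper() for w in words) / len(words)

def suspicious_title(title: str, threshold: float = 0.9) -> bool:
    # Norwegian convention capitalizes only the first word, so a title
    # where nearly every word is capitalized is worth a second look.
    return capitalized_word_ratio(title) >= threshold
```

As the comment says, it's only a signal, not proof of spam.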

  • eitland 4 years ago

    > edit: one other thing I have seen, but it doesn't mean it is always spam. All The Words In A Title Are Capitalized - it's something to pay attention to whether it is spam or not. Conventionally, titles are usually not like that in Norwegian.

    Another big one is that Norwegians, like Germans, write compound words together. Just one example from one of the stupid ads: "Spesial Reportasje" is a dead giveaway, and not only because of the capitalization.

    (Oh well, sadly, thanks to years of pressure from Word's incompetent spell checker and lenient teachers, this is getting worse. I fear we are seeing compound damage here, as kids that got away with this are now becoming teachers...)

    • eru 4 years ago

      In German there was actually quite a lot of historic development about whether to write words together or separate or with hyphens.

      The current state of formal German will surely not be the end of history.

      See https://de.wikipedia.org/wiki/Leerzeichen_in_Komposita#Gesch... (Might need Google Translate, if you don't speak German.)

      • rvba 4 years ago

        For someone learning German as a foreign language separating the words would really help. Even if it leads to things like "Trink Wasser fur Hunde" (as mentioned in the Wikipedia article).

        Hyphens or spaces are still better than those long words...

        • eru 4 years ago

          Yes, I can see that. Pervasive hyphens would resolve most of the ambiguity and make it easier to learn. (But they also look kind of ugly.)

          Just be glad you ain't learning Turkish or Finnish, though.

          • CRConrad 4 years ago

            I find the difficulty of learning Finnish is being consistently exaggerated on the Internet.

            Source: Learned it as an adult myself.

            • eru 4 years ago

              Oh, I'm not saying it's difficult (no clue whether it is). I'm saying that they have long words.

            • qkls 4 years ago

              What languages did you speak before learning Finnish and what level are you currently at?

              • CRConrad 4 years ago

                > What languages did you speak before learning Finnish

                In order of skill, or chronological? C: German, Swedish, English, and French. S: Swedish, English, German, and French.

                > and what level are you currently at?

                "Level"... Lived and worked here 26 years (longer than in any other country), usually speak only Finnish with colleagues. (Exception: my previous job, 2014-18, at an unusually international company; quite a lot of English there.) What number is that, on whatever scale you were thinking of?

                [EDIT:] IOW, it's gotten to the point where I fear my actual native language is only my fourth-best any more, Finnish having pushed it off the podium. [/EDIT]

                Anyway, my point was: I may have improved a bit since, but was probably quite close to my current "level" after two or three years. It really isn't all that humongously difficult as it's made out to be.

                The logic of a multi-inflected agglutinative language may feel unusual at first, but once one gets used to it, it's just that: logical. The orthography and especially the pronunciation of Finnish is the most straightforward of all the languages I've dabbled in (smatterings of perhaps half a dozen more besides the ones I speak). And I think, above all, it has the fewest cases of "this is the rule, but the exceptions are this, that, and the other", where you just have to learn by rote that "this word works that way, but that word works this way", of any language I've come across. Learn the rules and you know it; no exceptions to learn.

                • qkls 4 years ago

                  Sori jos kuulosti hyökkäävältä, kiinnosti vain kuulla lisää miten suomen oppimisen vaikeus koetaan :) ("Sorry if that sounded aggressive, I was just interested in hearing more about how the difficulty of learning Finnish is perceived :)")

                  I'd say that it's fluent level then. I've met people that have lived in Finland for 20 years but can't still form sentences in Finnish.

                  The "Finnish is hard" trope is probably a mental block, Finnish probably seems hard on surface but it's very logical if you dive deep into it.

                  • CRConrad 4 years ago

                    No kyl se näyttää hyvin avaruusoliokieleltä ellei sitä ainakin vähäsen jo osaa, joten kai ihan ymmärettävää... ("Well, it does look very much like an alien language unless you already know at least a little of it, so I guess that's quite understandable...")

        • IfOnlyYouKnew 4 years ago

          Compound words are about 70% of the fun we have.

    • bigpeopleareold 4 years ago

      This reminds me of the facebook group: Bilder i kampen mot særskrivingfeil: https://www.facebook.com/ettord :D

      (Something like: Pictures in the struggle against mistakes when using spaces between words)

      • eitland 4 years ago

        There's also the Norwegian "Astronomer mot orddeling" ("Astronomers against word splitting") that is very open to non-Astronomers as well.

  • bigpeopleareold 4 years ago

    Just want to add to my comment that this is not limited to havfruen4220.dk; it's a general pattern. I tried a couple of search terms like 'mattilbud rema 1000' and found more .dk domains on the second page (nem-varmepumper.dk, humanrebels.dk), two domains that have nothing to do with food.

the_biot 4 years ago

For all that Google search has been utter crap for going on a decade now, I have to admit part of the reason is that they get targeted relentlessly by SEO spam operations like this. I like DuckDuckGo for now, but I imagine as they get bigger they're going to be a target for this kind of spam just the same.

  • boomlinde 4 years ago

    > they get targeted relentlessly by SEO spam operations like this.

    Why, though? There is an arbitrary ranking system that seems increasingly independent of what I actually searched for. Google has created a game where the winner isn't necessarily relevant or at all useful. It's inevitable that spammers will play that game.

  • skinkestek 4 years ago

    > I have to admit part of the reason is that they get targeted relentlessly by SEO spam operations like this.

    A bit of it is probably that.

    Outright ignoring my queries, though (+, double quotes, "verbatim" and all), takes more than SEO tactics: it takes someone inside Google, either malicious or, more probably, incompetent.

    Or more probably: someone was so busy trying to use AI in search that they haven't had time in the last ten years to consider whether it was smart.

    • logicchains 4 years ago

      >Or more probably: someone was so busy trying to use AI in search that they haven't had time in the last ten years to consider whether it was smart.

      Or maybe Google started applying "We know better than the users", the driving principle behind their software and libraries, to their search.

  • fauigerzigerk 4 years ago

    Is there really any difference between DDG and Google when it comes to SEO spam? If there is, I sure haven't noticed in spite of using both, often for the same search terms.

    It seems to me that the techniques used to spam Google's index work just as well on Bing's index.

  • raverbashing 4 years ago

    Even worse, getting this kind of spam into DDG (Bing?) seems easier than into Google.

    It seems DDG is worse at finding the more authoritative sites about a subject compared to Google.

    • shuger 4 years ago

      That's an advantage. Since Google tuned their engine to treat authoritative results as better, their searches have become absolute dogshit.

      You search for a very specific thing, and all the results are big sites that happen to contain two of the six words you searched for, in a completely generic article that helps you none.

      My favorite is when your query contains a word that is the very essence of what you are searching for, and Google chooses to display results without it, so you have to do an extra click to say "yes, I actually want to search for what I said I want to search for".

  • rvba 4 years ago

    Because they automated everything, and you cannot contact any human in quality assurance.

  • beebeepka 4 years ago

    Google search has been a brochure for a long time now

dhosek 4 years ago

The one thing I want more than anything from Google or DuckDuckGo or anyone, really, is the ability to give a list of domains and never have their results show up in my searches. I know I can do this on a per-search basis, but I want it to be a configurable setting.

  • mattwad 4 years ago

    uBlacklist is a plugin that does this. It's so great to be able to hide all those sites that just cache GitHub issues and SO posts.

    • RileyJames 4 years ago

      Just to add to that, uBlacklist has a powerful feature called subscriptions, which is massively underutilised.

      It enables a collaborative effort in blocking spam / low value domains.

      If you make a block list, please submit it to the list I’ve made: https://github.com/rjaus/awesome-ublacklist

      (There’s no great subscription discovery as yet)

    • nickysielicki 4 years ago

      Oh man, this plugin is going to save me hours of time over the next 30 years. Goodbye forever, cplusplus.com

    • dhosek 4 years ago

      I installed it and it's—ok? For search results where the spam overwhelms the signal (it used to be possible to do a decent reverse phone lookup by putting a phone number into Google), you end up with empty or mostly empty pages of search results. Better than nothing, but it really should be a feature of the search engine, not a browser plugin.

  • eitland 4 years ago

    I used to have a text document on my desktop containing a list of domains that contained autogenerated content, each with a minus in front, like:

    -stupidautogeneratedcontent1.com -stupidautogeneratedcontent2.com etc

    I figured sooner or later Google would pick up the signal, but I think instead they just started ignoring my "-" requests, so I stopped using them. Edit: or maybe they fixed the problem. Spam sites used to be an issue during the early decline of Google; I think that problem actually almost disappeared for me and was replaced by irrelevant results from non-spam sites.

    Edit: mahalo.com was one of those, https://en.m.wikipedia.org/wiki/Mahalo.com

  • niutech 4 years ago

    Just filter out results using uBlock Origin like this:

       google.*##.g:has(a[href*="example.com"])

matsemann 4 years ago

Yeah, I've seen this domain a lot lately. But I've complained about the Norwegian results for years [0]. For most searches there will be a result that's just keyword spam ranking high. I retried my "pes anserinus bursitt" search now, 2 years later, and two results are spam from havfruen, and there are other results from https://no.amenajari.org, which is also just translated and scraped content in every language; Google seems to love it, as I've seen it for years. A third domain I often see is "nem-varmepumper". Apparently a site about heat pumps has content on everything.

Can't fathom Google not catching this...

[0]: https://news.ycombinator.com/item?id=21621099

  • porbelm 4 years ago

    When I try that search, havfruen is seventh place. NHI and other good results at the top.

    YMMV a lot with Google results. For me, it's usually great where DDG is kinda crap, but not as bad as... shudder ... bing

    • fogihujy 4 years ago

      With DDG, I found this thread. Google set to Norway as region/language found nothing from havfruen4220.dk, unless I specifically added site:havfruen4220.dk in the search.

      My guess is that someone at Google reacted.

      • matsemann 4 years ago

        Almost all my searches from the last days still show havfruen as a result somewhere. My pes anserinus above. Or "obos fellesgjeld" from a ~week ago which was when I noticed the pattern first. "monstera jord" gives lots of translated blogspam, and then a row of havfruen results.

        Switching language on Google has basically no effect. Sometimes I want to find Swedish results for a thing with the same name, but no matter what I do I get Norwegian results ranked first. So don't think this is easily emulated from abroad.

        • fogihujy 4 years ago

          Yeah, that might be it. Finnish Google is rather useless in general -- especially for Swedish results -- and I expect the same for Norwegian ones.

      • eitland 4 years ago

        Just unintentionally confirmed it was still there for me when I searched for Roblox gift cards.

bash-j 4 years ago

The last time I accidentally installed malware on my computer was when the top Google result pointed me to a site masquerading as the official site for the software. That taught me a lesson to pay attention to the domain name.

hayksaakian 4 years ago

Interesting because it shows that bounce-back is a more significant ranking factor than before.

It seems like they've manipulated rankings by locking people in to reduce their bounce-back stats (in addition to keyword-stuffed content)

  • ma2rten 4 years ago

    I don't think it necessarily shows that. Their good ranking could be completely unrelated to bounce-back.

    • wokwokwok 4 years ago

      Who knows? It's a black box after all.

      ...but, you know. Can you see anything else they're doing that would give them that kind of ranking? These pages are just piles of crap, and google is pretty good at filtering that sort of stuff out.

      If it was that easy, google would be filled with spam everywhere.

      The chance that someone did something random that's very uncommon (blocking the back button) and it happened to be a super effective signal to Google seems:

      a) like an edge case they didn't think of

      b) like it'll get fixed pretty fast

      c) not that unlikely.

      Compared to, say, the idea that some random spammers have built a network of incredibly sophisticated ML-generated pages that can subvert Google's algorithms, which seems:

      a) not substantiated by any obvious content on the pages

      b) requires a very high level of sophistication which seems totally lacking

      c) very unlikely

      ...but I mean, who knows right?

      We're all just speculating. I guess it'll get fixed soon, and we'll never know.

      • Miraste 4 years ago

        Everyone and their mother blocks back buttons. Major news sites do it. There is no way that's what's ranking them this high.

        • chopin 4 years ago

          Sites should be punished into oblivion for doing this. Why do browsers even allow it? Is there a legitimate use-case for this?

          I am maintaining an SPA, and the only thing I do is try not to pollute history. But I'd never try to block the back button.

          • ma2rten 4 years ago

            I think a legitimate use-case is asking for confirmation before leaving the page when filling out a form, so that the user doesn't lose their data.
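
            A minimal sketch of that use-case in plain browser JavaScript (the `#checkout-form` selector is a made-up example; the handler is factored out so the logic is easy to follow):

            ```javascript
            // Track whether the form has unsaved input.
            let formDirty = false;

            // beforeunload handler: only prompt when there is something to lose.
            function beforeUnloadHandler(event) {
              if (!formDirty) return undefined; // nothing unsaved: leave silently
              event.preventDefault();           // standard way to request the prompt
              event.returnValue = '';           // legacy browsers read returnValue
              return '';                        // some browsers read the return value
            }

            // Wire it up when running in a browser (guarded so the sketch also loads in Node).
            if (typeof window !== 'undefined') {
              window.addEventListener('beforeunload', beforeUnloadHandler);
              document.querySelector('#checkout-form')
                ?.addEventListener('input', () => { formDirty = true; });
            }
            ```

            Note this asks before leaving rather than hijacking history: the browser shows its own generic dialog, and the user can still leave.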

            • chopin 4 years ago

              But this can be achieved with other means. I have exactly this on my SPA.

    • purplepatrick 4 years ago

      I agree. Tons of sites employ bounce-back avoidance tactics, and these don't particularly help their ranking (in fact, lots of non-ranking sites do it — presumably just to keep you on the page).

    • soheil 4 years ago

      I wonder why Yandex opens every link in a new window. How can they track bounce-back?

      • adventured 4 years ago

        You could do a slightly more difficult, less direct sequence check on the specific user.

        If they circle back around to Yandex in N time and go hunting for the same query or similar query, then you can rank the prior attempts as not having been ideally helpful (downrank the result/s they clicked through to when they last searched for that query two minutes ago).
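
        Sketched as toy code (nothing here is Yandex's actual algorithm; every function name, the time window, and the similarity threshold are invented for illustration):

        ```javascript
        // Crude query similarity: fraction of shared words.
        function querySimilarity(a, b) {
          const wa = new Set(a.toLowerCase().split(/\s+/));
          const wb = new Set(b.toLowerCase().split(/\s+/));
          const shared = [...wa].filter((w) => wb.has(w)).length;
          return shared / Math.max(wa.size, wb.size);
        }

        // events: [{ time, query, clicked }] sorted by time (seconds).
        // Returns URLs whose click was followed within `windowSec` by a similar
        // query, i.e. candidates for a "that result didn't help" downrank.
        function downrankCandidates(events, windowSec = 120, minSim = 0.5) {
          const candidates = [];
          for (let i = 0; i < events.length; i++) {
            for (let j = i + 1; j < events.length; j++) {
              if (events[j].time - events[i].time > windowSec) break;
              if (querySimilarity(events[i].query, events[j].query) >= minSim) {
                candidates.push(events[i].clicked);
                break;
              }
            }
          }
          return candidates;
        }
        ```

        E.g. a click followed a minute later by a near-identical query marks the clicked result as a downrank candidate, with no need to observe the back button at all.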

        • soheil 4 years ago

          Makes sense, does DDG do the same? If yes isn't that against their "We don't track users" mantra? If no how do they improve their results while missing such a powerful signal?

          • adventured 4 years ago

            I would be fairly certain that DDG isn't itself tracking users in a manner that they can use that ranking approach. They don't need to.

            If you do a search for the same terms in DDG vs Bing, you'll find that the results are very similar. DDG lets Microsoft do the dirty work of abusively tracking users to max out on ranking factors, and then DDG reaps the benefit. DDG doesn't need to get its hands dirty, because someone else is doing so much of that for them.

            By leaning so heavily on Bing, DDG is a blood diamond merchant of privacy. They might not own the mines or directly command the labor, however they're quite happy to buy the blood diamonds to further their own profit afterward. And DDG's users go along with the scheme, because buying into the con helps them sleep better at night. It works like this mentally: those users over there (at Bing) are having their human right to privacy violated, I know it's going on, and I directly benefit from the search data training as I use DDG, but hey at least it's not me being abused, so I can do my virtue signal dance and sleep well at night comfortable in my compartmentalization.

            • berkes 4 years ago

              Your entire rant hinges on the premise that Bing and DDG rankings are the same. They are not, and you can easily check that for yourself.

              Especially not for terms that would have a personal vector, like the ambiguous 'Ruby Gems' or the controversial 'effectiveness of mouth masks'.

      • bobuk 4 years ago

        98% of Russian sites use Yandex Metrika. It's really easy to track bounce-back if you control both the search engine and the web analytics tool :)

        Actually, Metrika is quite big even outside the ex-CIS countries; IIRC it's at about 15% of GA in terms of number of sites.

      • power78 4 years ago

        Doesn't the opening site have access to the child window/tab in JavaScript? Can't it set events to fire when the window is closed?
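
        For reference: `window.open` does return a handle to the child window, and the handle's `closed` property stays readable even across origins. A sketch (the polling interval is an arbitrary choice; only the watcher below is plain testable logic):

        ```javascript
        // Returns a checker that fires onClosed exactly once when win.closed
        // becomes true, and reports whether the close has been observed.
        function makeCloseWatcher(win, onClosed) {
          let fired = false;
          return function check() {
            if (!fired && win.closed) {
              fired = true;
              onClosed();
            }
            return fired;
          };
        }

        // In a browser the opener could poll the handle like this:
        // const child = window.open('https://example.com', '_blank');
        // const check = makeCloseWatcher(child, () => console.log('tab closed'));
        // const timer = setInterval(() => { if (check()) clearInterval(timer); }, 500);
        ```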

  • FeepingCreature 4 years ago

    That seems automatically testable. Load the site in a simulator, then look at the URL history.
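
    One way to sketch such a check: instrument a history-like object and count how often the page pushes entries, since a back-button trap typically floods the history stack right after load. In a real crawler this would be injected into a headless browser (for instance via Puppeteer's `page.evaluateOnNewDocument`); here the history object is a stand-in so the counting logic runs anywhere, and the threshold is an arbitrary guess:

    ```javascript
    // Wrap pushState/replaceState on a history-like object and count calls.
    function instrumentHistory(history) {
      const stats = { pushes: 0, replaces: 0 };
      const origPush = history.pushState.bind(history);
      const origReplace = history.replaceState.bind(history);
      history.pushState = (...args) => { stats.pushes += 1; return origPush(...args); };
      history.replaceState = (...args) => { stats.replaces += 1; return origReplace(...args); };
      return stats;
    }

    // Heuristic: several extra entries right after load looks like a trap.
    function looksLikeBackButtonTrap(stats, threshold = 3) {
      return stats.pushes >= threshold;
    }
    ```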

rwmj 4 years ago

I've also seen this, but from a different side. I have Google Alerts for many open source projects that I run, but in the past few years these alerts have become all but useless. Spammers scrape genuine pages from all over the place (including ones containing references to my projects) and put them into scammy ".it" domains. These appear both in Google Alerts and high up in Google Search. So alerts and search both become useless. The scam appears to be that when you visit these web pages they say you're the billionth (or whatever) visitor to Google and you've won a prize, just type in your bank details.

This has been going on for years now, so I don't have much confidence that Google is able or willing to fix it.

l0b0 4 years ago

WHOIS shows it's registered four weeks ago by someone in Riga, Latvia.

hoppla 4 years ago

The reCAPTCHA process should be reversed. Sites should prove to humans that their content is not generated by bots.

  • ant6n 4 years ago

    Perhaps a search engine that deranks pages that monetize visits (like ads) would be a good first step.

wdrw 4 years ago

Interesting, the image seems to contain characters from a Russian childrens' cartoon ( https://en.wikipedia.org/wiki/Kikoriki )

fny 4 years ago

Somewhat related: has anyone else noticed a massive change in breadth of results? I was searching for reviews for diving equipment and some less niche items and I feel like I'm being spoonfed results from the same comparison engines. Since when did algo content become king?

  • estebarb 4 years ago

    I feel the same. Looking for specialized topics with Google is now very difficult. It is now impossible to look for phones, uncommon words, or anything that is not the mainstream result.

    I'm not sure if the culprit is BERT or neural ranking in general. But in recent years I feel it is more and more common that I leave Google search without useful information. The worst part is that all the competing search engines use the same algorithms, which are only useful for mainstream results.

    • fy20 4 years ago

      I noticed this in my country when searching for somewhat less common parts (electronics, car parts, tools, etc). The first few results are for online retailers in my country, and then after that it's full of domains with paths such as /sale_12345678. The domain sounds somewhat promising, and the description sounds good - other than it often being a quantity of 10 - but when you click the link it just redirects to AliExpress.

    • ffffwe3rq352y3 4 years ago

      I find that using another search engine in that kind of situation is extremely useful! If I'm searching for more mainstream stuff, Google usually is great, but when I'm going for more specialized topics, DuckDuckGo will usually bring up some different links!

      • HideousKojima 4 years ago

        Pretty much the only alternative to Google Search is Bing. That's even what DuckDuckGo uses behind the scenes.

        • lordnacho 4 years ago

          Does this actually work though? Wouldn't the major search engines more or less look at the same information?

          Or is there some thing that causes Bing to show different results? Perhaps the scammers build a network that targets google because it's bigger?

        • slacktide 4 years ago

          I’ve been using Yandex more and more. Better search results, less censorship. Thanks, Rooskies!

          • Cipater 4 years ago

            Yandex reminds me of what Google was like in the early 2010s. It just gives me the results of the search term I put in.

            Google increasingly thinks it knows better than me what I'm looking for.

        • ffffwe3rq352y3 4 years ago

          Yeah thats why I said to use it!

      • foobarian 4 years ago

        Welp, time to dust off that HotBot codebase and get it running again! /s

    • infogulch 4 years ago

      Search engines seem to be stuck between serving two roles: 1. An easily accessible directory of mainstream information, and 2. A specialized tool to find the diamond in the rough. It seems like it has to be a tradeoff, it can't serve both roles equally well.

    • dukeofdoom 4 years ago

      This happens for unpopular events too.

      Memory Hole

      "The alteration or outright disappearance of inconvenient or embarrassing documents, photographs, transcripts, or other records, such as from a web site or other archive. Its origin comes from George Orwell's "1984", in which the memory hole was a small incinerator chute used for censoring, (through destroying), things Big Brother deemed necessary to censor."

      https://www.urbandictionary.com/define.php?term=Memory%20Hol...

    • visarga 4 years ago

      > I'm not sure if the culprit is BERT or using neural ranking

      Tools are not to blame here; it's like blaming the compiler for the behaviour of an application. From the training data to how the model is used in deployment, the blame lies with the people who made it, not with the neural architecture. The architecture itself can learn anything you throw at it, good or bad.

  • ajsnigrutin 4 years ago

    At least you get the results you are looking for... I search for three keywords, and it chooses to ignore the two specific ones and show only the general one (while putting a note under the search result that it does not contain some keywords).

    Basically, it's like searching for diving suit thickness and Google ignoring "suit" and "thickness" (until I specifically put those two words in quote marks), only showing me results for diving.

    • sunshineforever 4 years ago

      I play a game with Google search: I take something very mainstream, like a movie title, say 'Reservoir Dogs', and change something in it, to 'Reservoir Cats' for example.

      Google search 'reservoir cats' and it will completely ignore what you actually searched for in favor of the mainstream result. The effect is basically that you can't search for 'reservoir cats'!

      Even putting something opposite or unrelated to the highly mainstream result will have no effect.

      It's completely ridiculous and makes the search engine seem like a facade.

      • ohthehugemanate 4 years ago

        I love this game. Great idea!

        Side note: both DuckDuckGo and Google gave me correct results for that specific search. Turns out "Reservoir Cats" is a movie and a Simpsons episode.

      • dalmo3 4 years ago

        Although I'm familiar with your point, I literally just searched for the term you mentioned, navigated all the way to page 6, and every single result was specifically for Reservoir Cats proper; none of them even mentioned Reservoir Dogs in the title, only in the description for some of them.

        • bryanrasmussen 4 years ago

          the same, maybe someone at google read this and fixed it reaaalllly quick.

          If I search for "Palp fiction" it shows me Pulp Fiction results but asks if I really meant "Palp fiction"; if I say yes, I really meant that, it shows me "Palp fiction" results with a message asking if I really meant "Pulp fiction".

          On edit: some of the Palp fiction results are headlined "palp friction".

      • leucineleprec0n 4 years ago

        I’ve noticed this but only recently began to feel like the behavior was different, I wonder how strong the correction is now relative to the past

      • raytracer 4 years ago

        Reservoir Cats is actually a movie! I feel like I've slipped into an alternative time stream.

      • darwingr 4 years ago

        There is an actual term defined for this. Not search hijacking... I can't remember.

    • donkeybeer 4 years ago

      In cases like that, it often ignores words even after double quoting them.

    • TheSpiceIsLife 4 years ago

      We ignored your search query and showed you results our highest paying customers / advertisers paid us to show you instead.

      I don't know of a good general internet search engine, so I tend to stick to the sites I know will provide answers that'll work for me, which is a shame for discovering new content.

    • pverghese 4 years ago

      I searched for diving suit thickness and it provides the correct information as the first result: the diving suit thickness for different temperatures. Not sure why you are not getting that information.

    • Guidii 4 years ago

      Odd. When I try that search[1] I'm seeing good results. There's a onebox telling me how thick a suit I need for different temperatures, followed by a bunch of articles on the topic.

      [1] https://www.google.com/search?q=diving+suit+thickness&rlz=1C...

  • yojo 4 years ago

    This exactly. I’ve been researching specific house repair issues and just get nothing but content spam. Whenever I want specific information I find myself adding “reddit” to the query string, which will usually turn up a thread with links out to the actual answer.

    • zadler 4 years ago

      Said it before and I'll say it again: when Reddit finally becomes inaccessible via searches, we will have lost a huge and very useful database of succinct information.

      • nullc 4 years ago

        You haven't noticed that Reddit became substantially inaccessible to search a number of months back?

        Every Reddit page, while not logged in, is full of hidden content from other unrelated pages. When you search, you'll get hits in these unrelated pages -- but when you follow the link, it's not there (because it's on the unrelated pages).

        Worse, the pages with the correct content aren't necessarily in the results at all because it was low enough in the thread that it was collapsed and wasn't visible to the search indexer.

        It's not a total loss, but I'd say about 80% of my own comments are now difficult-to-impossible to find via search when they were easy previously.

      • tesseract 4 years ago

        Probably even more information of that nature is hidden in Facebook Groups where it was never searchable in the first place.

        • olyjohn 4 years ago

          Millions of useful photos have disappeared off of forums, now that Photobucket is dead.

  • jhoechtl 4 years ago

    Searching in Google has become all about shopping. Pure and relevant content is hard to find.

    Even today there are bloggers out there who do not have a commercial affiliation with the goods/items/things they are blogging about. Such content is practically impossible to find amid all the Amazon-affiliated, pseudo-information-conveying spoof sites.

  • mdolon 4 years ago

    I wrote a blog post complaining about this early last year: http://mdolon.com/essays/amazon-has-ruined-search-and-google...

    The Amazon affiliate program is definitely contributing to this problem.

  • pjmlp 4 years ago

    Same here, I no longer can find anything sensible on Google, regardless how much I try to customize the search expression.

    Additionally, as a polyglot, I find it very irritating that Google tries to helpfully translate queries for me, so I have to go to other search engines to actually find the article in the language I want.

  • alfiedotwtf 4 years ago

    I'm just sick of seeing Pinterest and Quora as the top 8 results :/

    • mahkeiro 4 years ago

      Pinterest is the worst, as you cannot see the results without registering… How can it be a relevant search result! Fortunately, -site:pinterest.com makes it usable.

      • bni 4 years ago

        Google should remove all Pinterest results. It makes image search especially a pain.

        It is spam pure and simple.

  • gomox 4 years ago

    I couldn't agree more. More and more lately it's felt like the AltaVista days. I know the information I'm looking for is out there; it's just not on the Google results page, which is plastered with unreadable stuff (paywalls, content farms), crap "content cards", and sneakier and sneakier ads.

    I'm not sure what the beginning of the end was for Google Search, but I think the day where they changed the ad background to white is a good candidate.

    Google Search used to be like Chrome or Gmail - we know it's wrong in the long term, but it's hard to stop using it because it just works so well.

    But these days, not anymore. Search is a lot less sticky, and it is their golden goose they are messing with here.

  • YeBanKo 4 years ago

    I have been struggling with the same issue recently. Results are much narrower and seem to lean towards consumer goods, though I don't remember ever buying something via Google search.

  • juskrey 4 years ago

    Simply put, Google lost the battle against SEO long ago and, trapped by its own cash flow, can't do anything radical to change that.

mmaunder 4 years ago

Catch-22, though. If you eliminate bounce-back, you have to rank first to get the ranking signal into Google. So how did they rank in the first place? I haven't tried to reverse what they're doing, but I don't think the author quite figured it out. Interesting phenomenon though.

nolito 4 years ago

According to DK-hostmaster (https://www.dk-hostmaster.dk/da/find-domaenenavn) its registered to Ance Dzerina. Ieriku iela 37, dz. 32, LV-1084 Riga, Letland

As of 2 July 2021.

That's pretty fast to work so well. But I see lots of this with other domains when searching, and have for years, so nothing new here I think.

Matsta 4 years ago

I had a look at this, and it looks to me like it's a 301 from another domain. Typically when domains get a manual penalty (primarily for spam), they drop in rankings overnight. So to counter this, you register a new domain and redirect it and overnight, your rankings bounce back. This technique is super common for blackhat sites like illegal streaming sites.

If the redirect is done as a meta refresh, then you can block it in your robots.txt from being picked up from SEO tools like Ahrefs, SEMRush etc.

These types of sites are called doorway pages and have been around for ages. They are most popular in Russia and on Yandex, but you do see them on Google for super longtail keywords with 0 competition.

The other important thing to remember is that doing SEO in any language that's not English is a walk in the park. Lots of SEO influencer types have case studies showing how much extra traffic they get by translating their content. [1]

[1] https://neilpatel.com/blog/seo-trend/

belter 4 years ago

The mermaid mentioned in the article seems to be either a terribly amateurish operation or a very sophisticated sting.

They can easily be traced to a block of flats in Latvia, but since their registered phone is a toy store in Riga... I am going to go with a probably-stolen identity and a sense of humour on their part, rather than the real operation being some 12-year-old in Riga...

agency 4 years ago

This is only tangentially related but has anyone else started getting more obviously spam emails in their gmail inbox lately? I feel like for a long time I never got spam in my inbox but lately I’ll get ones that seem like they should be easy to detect, talking about gifts and stuff and uSiNg wEirD capitals or s p a c i n g. Is it just me?

  • sp332 4 years ago

    Yes, and more non-spam email is getting filtered as spam. Also, a mailing list I was unable to unsubscribe from and marked as spam at least 5 times kept being delivered to my inbox.

  • beart 4 years ago

    I'll chime in as well. I forward everything from gmail to another account I have. I pretty much never got any forwarded email for years because the gmail account is only really used as an identity for google services. A few months ago I suddenly started to get a significant amount of spam forwarded for no known reason.

  • philiplu 4 years ago

    Not just you. Something changed two or three months ago. Never really saw spam for years before that; now 3 or 4 mails a day.

  • javier2 4 years ago

    Yes, on a few days I've even had 5 different spam emails in the inbox.

kostecki 4 years ago

Interesting that Latvians picked a Danish domain for Norwegian content, especially since you can't just hide behind domain privacy protection.

ocdtrekkie 4 years ago

My guess is they get away with it because it's a non-English query and most of the people working on these problems aren't looking at their localization. A big issue in general for global tech companies is that they don't usually handle things outside the US/English context particularly well. This often crops up in that political space, where for instance, something contentious like gun sales might get pulled from Google globally even though the political concern with them is mostly limited to the US.

An SEO-fighting Googler might, at a glance, have no reason to doubt that it's a genuinely relevant or popular site in your country.

rapind 4 years ago

> I think that Google uses stats on whether the user continued checking more results for that specific search query to determine if the visited result answered the user.

God I hope not. If Google does do this, it sounds like a really dumb idea, which will ultimately create widespread usability issues. I can already envision SEO consultants recommending this for their clients if this is believed.

Doesn’t look like it according to https://www.seroundtable.com/google-browser-back-button-rank...

evolve2k 4 years ago

Before I accessed the article I was hopeful from the title that “The Mermaid” was some hot new search engine out of Norway.

franze 4 years ago

On a similar note: https://www.autosuggest.net/ is currently approaching a lot of websites in the German market.

"We help you to receive high-quality visitors from search engines, generate conversions and build your brand. To achieve these results, we ensure your website / company is recommended for specific keywords by the search engine's autocomplete function."

pope_meat 4 years ago

Gotta give it to these folks, good hustle.

gonab 4 years ago

Google has a problem when HN becomes an issue tracker

zulrah 4 years ago

I've noticed another trend recently where some websites write content for Google SEO instead of optimizing for human readability. E.g.: I've seen my exact search phrase repeated multiple times, followed by a very long article about the topic, when what I searched for was a simple question with a few-word answer.

kristofferR 4 years ago

Yeah, I experienced this same spam domain for some searches I did yesterday. It's everywhere.

knolax 4 years ago

More reasons why a global search monopoly is suboptimal. Smaller markets like this just get neglected, maintained only enough that a better alternative can't compete. Google search is basically useless in any language other than English.

  • aembleton 4 years ago

    Surprisingly, no one other than Yandex has created a search engine that targets another language.

yfkar 4 years ago

I've lately noticed that searching Google for topics related to gardening in Finnish often gives me some scraped and machine translated pages from Russia. Really annoying that totally useless content is so high up in the results.

tvirosi 4 years ago

Google search seems to have gotten significantly worse lately (sometimes to the point that it's barely usable). From scams like these (I've seen others) somehow getting a foothold, to a lot of internal "unbiasing" skewing the results towards googles political stance (usually totally irrelevant to my query). It's gotten to the point that I barely google anymore other than for things I already know what the results will be.

Crazyontap 4 years ago

Can somebody else who is in Norway confirm this? It could simply be malware injecting this. It would be great to eliminate that possibility.

  • intarga 4 years ago

    I’m in Norway, and I tried the first search “rema 1000” without getting any spam results on the first two pages…

    That doesn’t entirely eliminate the other possibilities though, google search isn’t deterministic, and the domain could have been reported since the article went up.

    • NorwegianDude 4 years ago

      Searches like "REMA 1000" (just a very well-known brand name) seem to be the best-case scenario, even according to the article (page 5).

      I've noticed that the ranking of the results changes really often.

  • sleepyhead 4 years ago

    It's not showing for the example search (Rema 1000) for me right now, but I did a search yesterday about a person/company, the results were news-related content, and I ended up on a site with the same image. However, I can't find havfruen (mermaid) in my browser history, so they must be using other domains as well.

  • knidoyl 4 years ago

    I'm in France and searched for the "how often" thing; it returned themermaid on the second page.

  • probably_wrong 4 years ago

    I tried four of the queries from Germany using a private window. 3 returned results from themermaid on the first page.

    In particular, the only results ranking higher than themermaid for "hvor ofte oppdaterer apple ios" are those coming from support.apple.com.

  • javier2 4 years ago

    It is not happening with the example from the article for me, but I have seen this practice ruin my search results to varying degrees over the past 6 months. Sometimes entire keywords will just be broken because there are so many fake sites.

  • Ueland 4 years ago

    Can confirm this is not malware, Google has a huge spam problem, see my previous comments.

fleddr 4 years ago

Makes you wonder what happens when AI can write "passing" articles: useless to the reader, but too close to the real thing for the crawler to tell.

nkozyra 4 years ago

> The simple solution would be to test sites regularly with an unknown IP and common user agent to check that a site isn’t just showing content to Google and gives real users something completely different. That would stop this.

Surely Google does this, right? Given that - in theory - showing different content to Google versus non-Google should result in a penalty, anyway ...

  • not2b 4 years ago

    The problem is that paywall sites already do this: Google sees the article, others see a paywall.

qwerty456127 4 years ago

For every country/market, somebody should build a search engine to compete with Google. This is now a chance for Norway.

  • matsemann 4 years ago

    Used to have https://www.kvasir.no/ but now it's just a skinned Google.

    sesam.no (no longer a valid domain) was an engine made by a big Norwegian company back in 2005 or so.

    Norway used to be big in search. FAST got acquired by MS back in 2008.

  • sleepyhead 4 years ago

    We had a fast one but Microsoft bought it and shut it down.

cnxsoft 4 years ago

Google is garbage. I once complained that a website stealing my content and other people's content was ranking very highly in Google. I was told I'd better fix my own website before looking at "competitors". Part of that was true, but at the time the person did not seem to care at all about spammy content delivered by Google.

StreamBright 4 years ago

Same in Hungarian. Google is full of spam and nobody cares. The top hits are auto-translated garbage for many searches.

qwerty456127 4 years ago

I'm surprised to find out people actually return to the search results page using the back button. Whenever I am serious enough (enough to keep looking after the first link I click does not satisfy me) about finding something I always Middle-Click or Ctrl+Click the links to open them in new tabs.

algismo 4 years ago

Just tried Google.no from my computer (Norwegian IP, Larvik area). Nothing similar; I see "normal" search results. Anyway, I stopped using Google stuff 5 years ago and have never looked back, so my search history is fairly clean; maybe that changes their algorithm's behavior.

Recommend to switch to DuckDuckGo:)

golergka 4 years ago

This image features characters from Smeshariki animation series, hugely popular in Russia in the last 15 years.

tikiman163 4 years ago

I'm kind of curious why he's so concerned about this. They've never managed better than ninth most relevant, and in most cases they didn't even make the first page of results. Any advertising person will tell you that if you aren't in the top 3 results (basically the top result, now that paid ads automatically get the top 2 spots on nearly all searches), your odds of being seen and clicked on drop to almost nothing.

Are they potentially doing harm? Sure. Have they successfully managed to trick anybody with this? I'd be extremely surprised if they're getting more than a dozen people a day clicking through from being the ninth result, and when people see they've been redirected to an advertisement, the majority immediately click away.

This isn't like clicking on a fake porn site that redirects to cam girls with viruses hidden in all the downloads. It's random unrelated searches redirecting you to blatant ads for cryptocurrency. The kind of people who are young enough to know what cryptocurrency is and how to buy it also know how to spot a redirect to a fake website.

  • burnished 4 years ago

    These kinds of scams are a stochastic process. They don't work on your average person; they only work on vulnerable people. Here's the catch though: everyone is vulnerable at some point in their lives. This is where the stochastic process comes in. They don't need to get you when you're strong, they just need to test enough people enough times to catch them in a vulnerable moment.

onepunchedman 4 years ago

The language in those scam articles is actually perfect; it's the first time I've seen that.

sublimefire 4 years ago

It is interesting, as you cannot see the content that is being indexed; I suspect only the bot does. If I understand correctly, this is the sequence of events from the bot's perspective:

## read robots.txt `curl 'https://havfruen4220.dk/robots.txt'`

## use pointer to a sitemap.xml

curl -A 'Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)' 'https://havfruen4220.dk/sitemap-no.xml' > sitemap.xml

## read more sitemaps

curl -A 'Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/W.X.Y.Z Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)' 'https://havfruen4220.dk/sitemap-no-1.xml' > sitemap1.xml

Other sitemaps contain a pointer to a "webpage" eg: https://havfruen4220.dk/no/7a28855e4714dd14

## read web pages

Each location in a sitemap has a "lastmod" of today/yesterday, so the bot returns there every day. In addition, each webpage has a `<meta name="robots" content="noarchive">`.

But if you visit any of those pages, it shows you a cartoon image. It seems the actual indexed content is visible only to the bot.
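A quick way to check this kind of cloaking yourself is to fetch the same URL with two different User-Agent headers and diff the bodies. A minimal sketch (the similarity metric, the 0.6 threshold, and all function names are illustrative assumptions, not anything from the article):

```python
import difflib
import urllib.request

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36"

def fetch(url: str, user_agent: str) -> str:
    """Fetch a URL with a specific User-Agent header."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def looks_cloaked(body_for_browser: str, body_for_bot: str,
                  threshold: float = 0.6) -> bool:
    """Flag the page if the two bodies are mostly different text.
    The 0.6 threshold is an arbitrary illustration, not a tuned value."""
    ratio = difflib.SequenceMatcher(
        None, body_for_browser, body_for_bot).ratio()
    return ratio < threshold

# Usage (requires network access):
#   a = fetch("https://havfruen4220.dk/no/7a28855e4714dd14", BROWSER_UA)
#   b = fetch("https://havfruen4220.dk/no/7a28855e4714dd14", GOOGLEBOT_UA)
#   print(looks_cloaked(a, b))
```

Note the caveat: if the site only reveals the real content after verifying Googlebot's IP range, a user-agent swap alone won't trigger it, so identical bodies don't prove the site is clean.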

## But how is actual content being rendered?

The question is: what conditions (request params/headers) result in the actual content being rendered? The bot needs to evaluate it. I suspect it is some combination of checks for whether the requester is an actual Google bot, maybe by looking up the IP: https://developers.google.com/search/docs/advanced/crawling/...
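The verification procedure Google documents at that link is a reverse-then-forward DNS check on the caller's IP: resolve the IP to a hostname, confirm the hostname belongs to googlebot.com or google.com, then resolve the hostname back and confirm it maps to the same IP. A hedged sketch of how a site might implement it (function names are ours):

```python
import socket

def is_google_hostname(hostname: str) -> bool:
    """A verified Googlebot reverse-DNS name ends in googlebot.com or
    google.com, per Google's published verification procedure."""
    host = hostname.rstrip(".")
    return host.endswith(".googlebot.com") or host.endswith(".google.com")

def verify_googlebot(ip: str) -> bool:
    """Reverse-resolve the IP, check the domain suffix, then
    forward-resolve the hostname and confirm it maps back to the IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse DNS
        if not is_google_hostname(hostname):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]  # forward DNS
    except (socket.herror, socket.gaierror):
        return False

# e.g. verify_googlebot("66.249.66.1")  # requires network access
```

The suffix check matters: a spoofer can name a machine `googlebot.com.evil.example`, which is why only the trailing labels count, and why the forward lookup has to round-trip to the original IP.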

cratermoon 4 years ago

The Norwegian pinterest

tapland 4 years ago

I imagine it's done in a similar way to how reddit circumvents searching for results from certain dates. I don't like anyone messing with google results.

techaddict009 4 years ago

Someone has probably found some kind of SEO hack or some 0-day in the Google SERP. There are plenty of .it domains doing similar things in Google USA SERPs.

fergie 4 years ago

Norwegian here- I haven't seen this at all- maybe the author has been somehow "fingerprinted" and targeted?

siproprio 4 years ago

I've seen this trend in other places too:

For example, Microsoft routinely deletes negative feedback from GitHub issues for VS Code.

ubercore 4 years ago

FWIW, I just tried these searches (am in Norway) and didn't see that domain in the results.

onepunchedman 4 years ago

Wow, the Norwegian on those scam websites is actually perfect. I've never seen that before.

  • Ueland 4 years ago

    That's because it's real content that they have stolen and republished. In SEO circles, people like to say that original content is king. Well, not so much after all.

punnerud 4 years ago

I live in Norway and don't have this problem now. I had a similar problem about a year ago on my MacBook Air because of some software that altered my Google results in all of my browsers. I don't remember its name, but something smelled fishy when the results were different from the ones on my phone.

  • oarthOP 4 years ago

    Pretty sure it affects you too, as it's the same for me on multiple networks, multiple user agents, multiple devices, and so on.

    Simply try one of the examples like "hvordan regne ut prosent" (how to calculate percentages) or, I don't know, "DNB aksje" (DNB stocks, DNB being the biggest bank in Norway). Sure enough, both rank on the first page or as one of the top results. (One is now using the www.nem-varmepumper.dk domain, which is the same thing.)

    EDIT: Now the DNB one moved from 2nd and 3rd place to page 2. Things are moving around quickly.

classified 4 years ago

If the mermaid took it, does that mean Google search is resting with the fishes?

manceraio 4 years ago

They will probably get outranked in the next big Google update.

claroclinic 4 years ago

Well this is happening in all countries

rataata_jr 4 years ago

Havfruen, brought to you by mountain trolls from Finnmark.

mlang23 4 years ago

It seems Google has lost its ability to block spam effectively. For a few months now, I have noticed an increasing amount of outright scams being promoted on YT. I even got an ad with a fake Musk telling people to invest in a shady bitcoin scheme. Knowing that Google is willing to let these slip through just to maximize their ad revenue is really a warning sign that this company, no matter how large it might be by now, should not be trusted anymore.

chovybizzass 4 years ago

I've been using https://search.brave.com for a few weeks. Most of the time I find what I need.

Goety 4 years ago

I will remain steadfast in my support of Google forever and always.

jessaustin 4 years ago

TFA talks about Google testing with "unknown IP", but doesn't mention any testing done by the author with cookies cleared or in incognito mode. This seems basic.

  • finnh 4 years ago

    What do you expect incognito to change? That would presumably show the same content the author is seeing. Only Google sees the content that drives the ranking.

    It is Google that needs "incognito" mode, not the author.

    • jessaustin 4 years ago

      I stopped using google for search because I noticed the filter bubble it was building around me. Perhaps that wasn't maintained by cookies, but in that case I wonder what it was...

    • QuietCF 4 years ago

      For all we know (as the OP doesn’t mention trying incognito) the OP could have malicious software on their device that hijacks their browser to manipulate search results

      • feikname 4 years ago

        That's easily verifiable by doing the searches yourself:

        A search for "hvor ofte bør man dusje" in my English Google, connecting from Brazil, shows havfruen4220.dk as the 6th and 7th results, which is pretty high for a spam website.

        "hvordan regne prosent" shows two .dk websites, www.humanrebels.dk and havfruen4220.dk, as the 9th and 10th results.

        It could be that, since the OP clicked on these links to find out, Google made his personal algorithm show even more of this stuff.

        Thus, I imagine a never-ending cycle of even more spam could easily be generated, especially for an innocent user.

      • aembleton 4 years ago

        Does incognito prevent malicious software from hijacking your browser?

londons_explore 4 years ago

It's the hooking of the browser back button in a way that Google does not detect which is the real 'trick'.

Anyone who can do that can rank as high as they like for any search query.

  • londons_explore 4 years ago

    To expand on this: A very strong ranking signal is how many of the users that click a search result are sufficiently satisfied with the information they have found to end their search.

    A good proxy for this is how many people don't click the 'back' button to see other results.

    Google is already aware of sites which hijack the back button. Their crawler detects this, and if they find it, they throw out the figures of how many people click the back button.

    So if you can find a way to hook the back button so nobody can click back, while stopping google thinking you have hooked the back button, then your page will keep creeping up the rankings.

    Google detects back button hijacking with their crawler (by rendering the page in Chromium and seeing the effect when hitting the actual back button), but this is circumvented by presenting the crawler different HTML (or making sure the page behaves differently in their crawler, potentially by checking things like the model of the graphics card; Google's crawlers don't yet support most of WebGL 2.0, and also simulate playing audio at the wrong rate).

    Google also detects how many real users click back. If it's zero, then that's a warning flag. So I'd guess the back-hijacking logic is only activated ~80% of the time.
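The trap described above can be modeled in miniature: on load the page pushes a duplicate history entry, and its popstate handler re-pushes one every time the user presses Back, so Back never escapes the page. A toy in-memory sketch (all names are illustrative; the article never shows the actual spam code, which would use `history.pushState` in browser JavaScript):

```python
class FakeHistory:
    """Toy stand-in for the browser's history stack."""
    def __init__(self):
        self.stack = []

    def push(self, url: str) -> None:  # models history.pushState
        self.stack.append(url)

    def back(self) -> str:  # models the user pressing Back
        if len(self.stack) > 1:
            self.stack.pop()
        return self.stack[-1]

def install_trap(history: FakeHistory, page_url: str) -> None:
    """On load, push a duplicate entry so Back lands on this page again."""
    history.push(page_url)

def on_popstate(history: FakeHistory, page_url: str) -> None:
    """popstate handler: re-arm the trap after every Back press."""
    history.push(page_url)

# Simulate a user arriving from Google on the spam page.
h = FakeHistory()
h.push("google.com/search?q=test")
h.push("spam.example/page")
install_trap(h, "spam.example/page")

current = h.back()                    # Back pops the duplicate entry...
on_popstate(h, "spam.example/page")   # ...and the handler re-pushes it
# current is still "spam.example/page": Back never reaches Google
```

This is exactly the behaviour a crawler can catch by rendering the page and pressing Back for real, which is why, per the comment above, the cloaking and the probabilistic gating are needed to keep the trick hidden.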

paxys 4 years ago

I doubt it's some crazy sophisticated SEO hijacking operation. Probably a result of a small data set (Norwegian language web pages), specific search terms (Norwegian brands, companies), and lots of keyword stuffing. Most of the examples the author pointed out were from pages 5-10 of Google results, which are probably worthless for ad revenue anyways.

  • Osiris 4 years ago

    He specifically pointed out that it's ranking in the top 10 for nearly every search he did.

  • tyingq 4 years ago

    It does rate a pretty good chuckle recalling old Google blog posts about their various uber-sophisticated anti-spam ML algorithms and how black hat SEO just wasn't possible anymore.

  • rchaud 4 years ago

    This type of scraped-content website was common in English-language searches back in 2010 or so. I believe the 'Panda' algorithm update eliminated them from English searches.
