How we blocked TikTok's Bytespider bot and cut our bandwidth by 80%

nerdcrawler.com

34 points by chptung 2 years ago · 22 comments

gizmo686 2 years ago

I don't get what Bytedance is doing here. Clearly they are not actively trying to evade blocks, as they are identifying their bot with a user agent sites can block.

However, surely they have enough smart engineers there to realize that running a bot at full speed (and, based on other reports, completely ignoring robots.txt) will get them blocked by a lot of sites.

If they just had a well-behaved spider, almost no one would mind. Getting crawled is a fact of life on the internet, and most website owners recognize it as an essential cost of doing business. Once you get a reputation as a bad spider, though, that is very hard to shake.

jd20 2 years ago

I didn't see it mentioned, but why not just use robots.txt? Does Bytespider ignore it?
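(For context, the rule in question is a one-liner; by most reports in this thread, though, Bytespider simply ignores it:)

```
User-agent: Bytespider
Disallow: /
```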

chasd00 2 years ago

Is returning a 403 based on the user agent worth a blog post? Also, can't Bytespider just change their user agent to Byte-Spider? Or just make their user agent a random string? It will be a forever arms race, requiring constant code updates to keep chasing the bot by user agent. You're probably better off whitelisting the known good user agents and blocking everything else.

Also, does it really require a specific "gem"? This is HTTP request filtering; the router (as in the real router, the metal box with network cables) can probably do it by itself these days.
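A minimal sketch of that kind of denylist check in plain Ruby, no gem required (the patterns beyond Bytespider are examples pulled from elsewhere in this thread, not an authoritative list):

```ruby
# Illustrative denylist -- patterns are examples, not exhaustive.
BLOCKED_AGENT_PATTERNS = [
  /Bytespider/i,
  /GPTBot/i,
  /ClaudeBot/i
].freeze

# Returns true if a request should get a 403 based on its User-Agent header.
def blocked_agent?(user_agent)
  BLOCKED_AGENT_PATTERNS.any? { |pattern| pattern.match?(user_agent.to_s) }
end
```

As noted above, this is trivially evaded by any bot that changes its user agent, which is exactly why a whitelist approach (or network-level blocking) may hold up better over time.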

  • phartenfeller 2 years ago

    For me the interesting part is that the crawler is going berserk. A single crawler should never account for 80% of traffic.

    Also, why would they not respect the 403? Crawlers just fetch anything they can find; it is not a targeted attack.

  • chptungOP 2 years ago

    It might not be, but I couldn't find much about the topic, so I figured I'd write it up and share. And you're right that this may be a bit of whack-a-mole, but for now I've cut my bandwidth down, which means I may be able to downgrade my Cloudinary plan to a lower tier. That's a big win for me, since it accounts for like 20-30% of my total operating cost.

braden_e 2 years ago

This is the worst-behaved bot I have ever seen; I suspect it is AI related. I recently decided to block all the AI crawlers, since unlike search engines I get nothing from them.

  • chptungOP 2 years ago

    Same! Any chance you can share a list of the bots you're blocking so I can add them too?

    • braden_e 2 years ago

      ClaudeBot (second worst crawler) and GPTBot are the only ones that identify themselves in an obvious way. The rest I am blocking by network. I assume AI when the crawler is very aggressively downloading images - it must be costing them an absolute fortune!
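      A sketch of blocking by network using Ruby's stdlib IPAddr. The CIDR ranges below are placeholders from the reserved documentation blocks, not real crawler networks; you would substitute the ranges you actually observe in your logs:

      ```ruby
      require "ipaddr"

      # Placeholder CIDR blocks (TEST-NET-2 and TEST-NET-3) -- NOT real
      # Bytedance or AI-crawler ranges; swap in ranges from your own logs.
      BLOCKED_NETWORKS = [
        IPAddr.new("203.0.113.0/24"),
        IPAddr.new("198.51.100.0/24")
      ].freeze

      # Returns true if the client IP falls inside any blocked network.
      def blocked_ip?(ip)
        addr = IPAddr.new(ip)
        BLOCKED_NETWORKS.any? { |net| net.include?(addr) }
      end
      ```

      Unlike user-agent matching, this holds up even when the crawler stops identifying itself, though the ranges need occasional updating.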

mmaunder 2 years ago

Is it just me or is that site a bit broken? Weirdly dark.

Edit: Nice try on the vote brigade guys. lol

  • chptungOP 2 years ago

    Yeah...I suck at optimizing for dark mode and I think I'm about to get too much traffic from this post so I can't fix it right now. Probably a tomorrow task haha

catoc 2 years ago

Can large companies not be faulted for ignoring robots.txt? Seems like something GDPR could enforce for personal(ly owned) sites?
