Robots.txt Won't Save You
This matters because, in an ideal world, every client request would declare its User-Agent honestly, so that website administrators could manage access appropriately.
Spoofing the User-Agent is necessary in practice because so many websites mishandle it to begin with. Websites, collectively, brought this state of affairs on themselves.
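The underlying weakness is easy to demonstrate: robots.txt rules are keyed to whatever User-Agent string the client chooses to report, so compliance is entirely voluntary. A minimal sketch using Python's standard-library `urllib.robotparser` (the bot name `BadBot` and the URL are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: blocks a crawler calling itself "BadBot",
# allows everyone else.
rules = """
User-agent: BadBot
Disallow: /

User-agent: *
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Enforcement depends entirely on the name the client self-reports:
print(rp.can_fetch("BadBot", "https://example.com/page"))       # False
print(rp.can_fetch("Mozilla/5.0", "https://example.com/page"))  # True
```

A crawler that simply reports a browser-like User-Agent falls under the permissive `*` rule; nothing in the protocol verifies the claim.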