Robots.txt files are cached for 60 minutes. Use the flush setting to force an update.

Enter a URL and a token to check whether that URL is disallowed by the site's robots.txt.
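
Below is a minimal sketch of how such a check might be scripted. It assumes a hypothetical HTTP endpoint that accepts `url`, `token`, and `flush` query parameters and returns a JSON body with a `disallowed` field; the real endpoint URL, parameter names, and response schema may differ.

```python
import requests  # third-party: pip install requests

# Hypothetical endpoint; replace with the checker's actual URL.
CHECK_ENDPOINT = "https://example.com/robotstxt/check"


def is_disallowed(url: str, token: str, flush: bool = False) -> bool:
    """Ask the checker whether `url` is disallowed by its site's robots.txt.

    Pass flush=True to bypass the 60-minute robots.txt cache and force
    a fresh fetch before the check is made.
    """
    response = requests.get(
        CHECK_ENDPOINT,
        params={"url": url, "token": token, "flush": int(flush)},
        timeout=10,
    )
    response.raise_for_status()
    # Assumes a JSON response like {"disallowed": true}; adjust to the real schema.
    return bool(response.json().get("disallowed"))


if __name__ == "__main__":
    # Normal check (uses the cached robots.txt if one is less than 60 minutes old).
    print(is_disallowed("https://example.org/private/page", token="YOUR_TOKEN"))
    # Force a cache refresh if the site's robots.txt changed recently.
    print(is_disallowed("https://example.org/private/page", token="YOUR_TOKEN", flush=True))
```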