[[["易于理解","easyToUnderstand","thumb-up"],["解决了我的问题","solvedMyProblem","thumb-up"],["其他","otherUp","thumb-up"]],[["没有我需要的信息","missingTheInformationINeed","thumb-down"],["太复杂/步骤太多","tooComplicatedTooManySteps","thumb-down"],["内容需要更新","outOfDate","thumb-down"],["翻译问题","translationIssue","thumb-down"],["示例/代码问题","samplesCodeIssue","thumb-down"],["其他","otherDown","thumb-down"]],[],[[["Some websites and CDNs are incorrectly using `4xx` client errors (except `429`) to limit Googlebot's crawl rate, which is detrimental."],["Using `4xx` errors for rate limiting can lead to content removal from Google Search and unintended exposure of disallowed content."],["Google provides clear documentation and tools to manage Googlebot's crawl rate effectively through Search Console or by returning appropriate HTTP status codes like `500`, `503`, or `429`."],["The correct way to manage crawl rate involves understanding HTTP status codes and using Google's recommended methods to avoid negative impacts on search visibility."],["For further assistance or clarification, website owners can reach out through Google's support channels such as Twitter or the help forums."]]],["Website owners should avoid using `4xx` client errors (except `429`) to manage Googlebot's crawl rate. These errors indicate client-side issues, not server overload. Using `4xx` codes (excluding `429`) can lead to content removal from Google Search, and if applied to `robots.txt`, it will be ignored. Instead, employ Search Console for rate adjustments or utilize `500`, `503`, or `429` status codes to signal server overload and manage crawl rates effectively.\n"]]