Provide alternatives for JavaScript-generated content
Avoid relying on JavaScript-generated content, as it can block crawlers from accessing your content.
Many modern websites use JavaScript to generate their content pages, which prevents or hampers many web crawlers from indexing that content.
Why is JavaScript bad?
- Most web crawlers do not process JavaScript, or have very limited support for it. They will see what a user sees in a web browser when JavaScript is disabled.
- Web crawlers that do have JavaScript support are still limited in their ability to crawl and index the content.
- Crawlers that support JavaScript are much slower and often use a secondary queue to process any JavaScript. This means pages may first appear in your index without their JavaScript-generated content, and only be updated later once the JavaScript has been processed.
- Even a JavaScript-capable crawler may still be unable to crawl your JavaScript-generated pages successfully. Success depends heavily on the JavaScript code itself. For example, if the JavaScript generates standard HTML you are probably OK. However, if your code generates something custom, such as a clickable image or a page that renders without standard HTML tags, it is unlikely that the page can be indexed: the crawler has no way of knowing what the links are or what to 'click' on to follow links to subsequent pages (see the sketch after this list).
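To illustrate the difference, here is a minimal sketch with hypothetical page markup and URLs. The first snippet injects standard HTML anchors that a JavaScript-capable crawler can still discover; the second navigates from a click handler on a non-link element, leaving the crawler nothing to follow.

```html
<!-- Likely indexable: the script injects standard HTML anchors,
     so a JavaScript-capable crawler can still discover the links. -->
<div id="nav"></div>
<script>
  document.getElementById('nav').innerHTML =
    '<a href="/products.html">Products</a> <a href="/contact.html">Contact</a>';
</script>

<!-- Unlikely to be indexable: navigation happens inside a click handler
     on a canvas element, so there is no href for a crawler to follow. -->
<canvas id="menu" width="200" height="50"></canvas>
<script>
  document.getElementById('menu').addEventListener('click', function () {
    window.location.href = '/products.html';
  });
</script>
```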
What should I do?
Try to use JavaScript to enhance your page content while still providing full access to that content when JavaScript is disabled. This approach, often called progressive enhancement, also has accessibility benefits for your website (see the sketch below).
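As a rough sketch of this approach (the page structure and URLs are hypothetical), the content below is plain HTML that any crawler can index, and JavaScript only adds behaviour on top of it:

```html
<!-- The full content is present as plain HTML, so it is indexable
     and usable even with JavaScript disabled. -->
<ul id="product-list">
  <li><a href="/products/widget.html">Widget</a></li>
  <li><a href="/products/gadget.html">Gadget</a></li>
</ul>
<script>
  // Enhancement only: sort the list for users with JavaScript enabled.
  // Crawlers and no-JavaScript users still see the complete list above.
  var list = document.getElementById('product-list');
  Array.prototype.slice.call(list.children)
    .sort(function (a, b) {
      return a.textContent.localeCompare(b.textContent);
    })
    .forEach(function (item) { list.appendChild(item); });
</script>
```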
If you rely on JavaScript-generated pages:
- Provide a non-JavaScript alternative (e.g. using the <noscript> tag) so that your website can still be navigated when JavaScript is disabled (see the sketch below).
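A minimal sketch of a <noscript> fallback (the menu markup is hypothetical): JavaScript builds the navigation as usual, and the <noscript> block supplies equivalent plain HTML links for crawlers and for users browsing with JavaScript disabled.

```html
<!-- The normal, JavaScript-generated navigation. -->
<div id="menu"></div>
<script>
  document.getElementById('menu').innerHTML =
    '<a href="/about.html">About</a> <a href="/news.html">News</a>';
</script>

<!-- Equivalent links, rendered only when JavaScript is unavailable. -->
<noscript>
  <a href="/about.html">About</a>
  <a href="/news.html">News</a>
</noscript>
```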