Mar. 28, 2019

The impact of Algolia on SEO

If implemented well, Algolia has no direct positive or negative impact on SEO. It does, however, have several small indirect positive effects, because search engines favor websites with a good user experience.

That said, there are things to take into account when implementing search through the front end with JavaScript, as we recommend, and this applies to any search provider. Check out our SEO tutorial here.

Many duplicate pages

By indexing search results from any provider, not just Algolia, you can end up with duplicate content, because faceted pages that show similar results all get indexed.

This can lead to a situation where link equity is spread across many pages, so that no single page has enough to rank highly in the search result pages. The fix is the canonical tag, which tells search engine crawlers, “Yes, this page is a duplicate, so give all of the link equity to this other page.”
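
As an illustration, here is a minimal client-side sketch of setting the canonical tag on a faceted results page. Stripping every query parameter is an assumption; keep any parameters that genuinely change the content, and on server-rendered pages emit the tag directly in the HTML instead:

```ts
// Minimal sketch: point all faceted variants of a results page at one
// canonical URL by dropping the facet and query parameters.
function setCanonical(): void {
  const canonical = new URL(window.location.href);
  canonical.search = ''; // assumption: no query parameter changes the content

  let link = document.querySelector<HTMLLinkElement>('link[rel="canonical"]');
  if (!link) {
    link = document.createElement('link');
    link.rel = 'canonical';
    document.head.appendChild(link);
  }
  link.href = canonical.href;
}
```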

Google also doesn’t recommend indexing search results pages: they reserve the right to penalize them, and many SEO experts recommend disallowing these pages in your robots.txt and adding a noindex directive. By letting search engines crawl low-value pages like search results, you use up your crawl budget, which may then not be allocated to more important pages.
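
For example, assuming your results pages live under a /search path (adjust this to your own routes), the robots.txt rule could look like this:

```
User-agent: *
Disallow: /search
```

Keep in mind that a crawler can only see a noindex directive on pages it is allowed to fetch, so a page blocked in robots.txt will not have its noindex read; pick the mechanism that matches your goal.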

Concerns About JavaScript Delivered Content

At Algolia, we recommend serving all search results through the front end with JavaScript. We recommend this because it is much better for the user experience, and happy users are returning users.

In the past, there was widespread concern that web crawlers were unable to see this content. In 2015, however, Google announced that they can. It is important to note that Google still recommends having individual URLs for the “pages” that the JavaScript creates.

To do this, use the browser’s History API, which lets you update the URL without reloading the page. This is good UX independent of the SEO benefits, which is generally how the two relate: what is good for users tends to be good for SEO. Note that our InstantSearch.js library does this automatically if you enable URL sync.
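
In current versions of InstantSearch.js the option is called routing. Here is a minimal sketch, with placeholder credentials and index name; under the hood it relies on the browser’s History API:

```ts
import algoliasearch from 'algoliasearch/lite';
import instantsearch from 'instantsearch.js';

const search = instantsearch({
  indexName: 'YOUR_INDEX_NAME', // placeholder
  searchClient: algoliasearch('YOUR_APP_ID', 'YOUR_SEARCH_ONLY_API_KEY'),
  routing: true, // keep the URL in sync with the current search state
});

search.start();
```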

However, if you want to be 100% safe, you can pre-render pages on the backend and serve them to search engine crawlers. This will become increasingly unnecessary as web crawlers get better at executing JavaScript.
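
One common approach is dynamic rendering: detect crawler user agents and serve them a pre-rendered HTML snapshot, while regular visitors get the JavaScript app. Here is a hedged Express sketch; the bot list is simplified and the prerender helper is a hypothetical stand-in for a headless browser or a snapshot cache:

```ts
import express from 'express';

const app = express();
const BOT_UA = /googlebot|bingbot|duckduckbot|baiduspider/i; // simplified list

// Hypothetical helper: in practice, render with a headless browser
// (e.g. Puppeteer) or return a cached HTML snapshot of the page.
async function prerender(url: string): Promise<string> {
  return `<html><body><!-- snapshot of ${url} --></body></html>`;
}

app.get('*', async (req, res, next) => {
  if (BOT_UA.test(req.headers['user-agent'] ?? '')) {
    res.send(await prerender(req.originalUrl)); // crawlers get static HTML
  } else {
    next(); // regular visitors get the JavaScript-driven page
  }
});

app.listen(3000);
```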

SEO Best Practices

Here are a few best practices, and a few reasons why Algolia search can help improve your ranking in search engines.

  • Make sure all the content of your site that is indexed by Google is also indexed in Algolia. We recommend implementing the initial rendering of your search results on the backend, because crawlers are better at parsing HTML pages than at executing JavaScript. After the initial load, switch to a JavaScript front end (InstantSearch, for example); see the first sketch after this list. This is the best compromise between good SEO and an amazing search experience.

  • You want people to be happy when visiting your website, and you want them to find what they are looking for. Google recommends sites where people spend more time. If Google recommends a site and a visitor doesn’t find what they’re looking for there, they go straight back to Google to search again. This back-and-forth signals to Google that people are not engaged on your site. Having a fast, relevant search such as Algolia helps keep people on your site!

  • We give you analytics that show which words people use and what they look for on your site. Most of the time, the words people search for on your site are the same as their Google queries. You can use these analytics to find out what terms people use and update the keywords on your pages accordingly; see the second sketch after this list.
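
To illustrate the first point above, here is a minimal sketch of rendering the first page of results on the backend, assuming the algoliasearch Node.js client (v4); the index name and record shape are assumptions:

```ts
import algoliasearch from 'algoliasearch';

const client = algoliasearch('YOUR_APP_ID', 'YOUR_SEARCH_ONLY_API_KEY');
const index = client.initIndex('products'); // hypothetical index name

// Render the first page of hits as plain HTML so crawlers can parse it;
// the JavaScript front end (InstantSearch) takes over after the initial load.
export async function renderInitialResults(query: string): Promise<string> {
  const { hits } = await index.search<{ name: string; url: string }>(query);
  return `<ul>${hits
    .map((hit) => `<li><a href="${hit.url}">${hit.name}</a></li>`)
    .join('')}</ul>`;
}
```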
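
For the last point, here is a hedged sketch of pulling top search terms from the Algolia Analytics REST API; the index name is a placeholder, and the exact response shape should be checked against the Analytics API reference:

```ts
// Fetch the most frequent search terms for an index from the Analytics API.
async function topSearches(appId: string, apiKey: string, index: string) {
  const res = await fetch(
    `https://analytics.algolia.com/2/searches?index=${encodeURIComponent(index)}`,
    {
      headers: {
        'X-Algolia-Application-Id': appId,
        'X-Algolia-API-Key': apiKey,
      },
    },
  );
  return res.json(); // e.g. { searches: [{ search, count, ... }, ...] }
}
```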
