Amazon Elastic Load Balancing & SEO Considerations

Oct 5, 2011 • 8:11 am | Filed Under Google Search Engine Optimization
 

I, like many others in the web development industry, am a huge fan of cloud services from Amazon, RackSpace and others.

The question of whether Google indexes content in the cloud has come up in the past, as has the question of how Google handles content delivery networks and CDN rankings. I've even written a post on how to use Amazon S3 for SEO benefit.

That being said, there is a new post about another Amazon Web Service, Elastic Load Balancing, where one user of the service is concerned that Google won't be able to figure out the site and its content.

The webmaster posted his concerns in a Google Webmaster Help thread asking:

I am looking to move our websites, 15 of them, to Amazon Cloud Services. I need to use a load balancer but their load balancers give you a public DNS name to use but change the public IP behind the scenes somewhat often. From what I understand the google bots take a harder look at you when you switch public IP(s) and because if Amazon uses dynamic IP(s) then I assume this could create trouble? Has anyone had any experience with using Amazon's ELB(s) or have any knowledge how Google will react with a Dynamic IP load balancing service?
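The behavior the webmaster describes is DNS-based load balancing: clients are given a stable public DNS name, while the IP addresses behind it rotate. One way to observe this is to resolve the name repeatedly and compare the address sets. Here is a minimal sketch; the ELB hostname shown is a hypothetical placeholder, not a real load balancer:

```python
import socket

def resolve_ips(hostname, port=80):
    """Return the sorted set of addresses the hostname currently resolves to."""
    infos = socket.getaddrinfo(hostname, port, proto=socket.IPPROTO_TCP)
    return sorted({info[4][0] for info in infos})

if __name__ == "__main__":
    # Hypothetical ELB DNS name -- substitute your own load balancer's name.
    elb_name = "my-app-123456789.us-east-1.elb.amazonaws.com"
    # Resolving the same name at different times can return different IPs,
    # which is the dynamic-IP behavior the webmaster is asking about.
    print(resolve_ips(elb_name))
```

Running this against a real ELB name several minutes apart will typically show the IP set changing while the DNS name stays constant, which is why crawling and linking should always target the hostname rather than any individual IP.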

JohnMu from Google answered his concerns, saying it shouldn't be an issue, but that GoogleBot may be a bit slower to crawl and index content. Here is John's full explanation:

There are many reasons why a site might change IP addresses from time to time, and that's fine with us. We don't automatically assume that they're doing shady things :)

The main issue that can sometimes arise is that Googlebot might take a bit of time to figure out how fast to crawl your pages. If the IP address changes regularly (say outside of a small set of IP addresses), that could look like your site is moving servers each time. In cases like that, we may crawl a bit more conservatively than if we've figured out that your server is sturdy enough to be crawled at higher rates. That also depends on how your website is set up, and if there's really content there that would need to be crawled at such a rate (eg if this is for a blog that's mostly static with a few hundred pages, then the crawl rate will not be that important, at least compared to a large news publisher that publishes hundreds of new articles a day).

If you should see issues with regards to the crawl rate not being as high as you would need it, I'd recommend submitting feedback in Webmaster Tools (Settings / Crawl Rate / Learn More / Report a problem with Googlebot). The team regularly reviews that feedback, and will generally be able to tweak the crawl rate accordingly.

So keep this in mind when using the Amazon Elastic Load Balancing solution for your server configuration.

Forum discussion at Google Webmaster Help.
