Google’s John Mueller has shared practical guidance for site owners on managing outdated AMP subdomains, helping webmasters make informed decisions about legacy AMP content that no longer serves its original purpose.

Mueller suggests two main approaches. The first is to implement 301 redirects from the AMP subdomain to the corresponding pages on the main domain, so any remaining traffic to the AMP pages lands on the standard versions and existing search equity is preserved. The second is to remove the AMP subdomain’s hostname from DNS entirely, which makes it unambiguous that AMP versions of the site no longer exist.

For sites with large page counts, such as those with around 500,000 pages, Mueller notes that legacy AMP URLs are unlikely to cause meaningful crawl budget problems. A further reassurance is that an AMP subdomain generally has its own crawl budget, separate from that of the main domain, so lingering AMP URLs should not heavily interfere with the crawling and indexing of the primary site. Understanding this separation lets site owners plan their approach to legacy content without worrying about their overall SEO performance.
Google Search Advocate John Mueller has provided important guidance on managing outdated AMP subdomains and the implications they have for crawl budgets. This advice addresses a common concern for website owners, especially those overseeing large-scale sites.
Recently, a website owner with around 500,000 URLs asked on Reddit why an abandoned AMP subdomain was still being crawled. Despite implementing 301 redirects three years ago to point visitors to the appropriate pages, the AMP subdomain continues to attract crawling activity. The situation illustrates the persistent challenges many websites face when moving away from previously adopted AMP solutions to newer frameworks.
Mueller’s insights are particularly relevant at a time when efficient site management is crucial for maintaining performance and user experience. He emphasised that website owners should be proactive in addressing legacy AMP setups: not only maintaining proper redirects, but also considering whether to remove the AMP subdomain from DNS entirely if it is no longer in use.
The challenges of transitioning from AMP highlight the importance of having a clear strategy when making such significant changes. Website owners must understand that simply implementing redirects may not be enough to halt all crawling activity from legacy subdomains. As the digital environment continues to evolve, staying informed and adaptable will be key to effectively managing website performance and ensuring optimal search engine visibility.
Key Details
The site owner has raised a significant issue regarding the continued crawling of hundreds of thousands of AMP URLs by Googlebot, despite taking various steps to eliminate them from search engine visibility.
Three years ago, the owner implemented 301 redirects for all AMP URLs, redirecting traffic directly to the main domain in an effort to phase out the AMP subdomain. Furthermore, the AMP sitemap file was intentionally left empty to signal to Google that these URLs should no longer be considered.
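For reference, an intentionally empty sitemap is simply a well-formed `<urlset>` with no `<url>` entries. A minimal sketch in Python, assuming the standard sitemaps.org namespace, of generating one:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def empty_sitemap() -> str:
    """Build a well-formed sitemap containing zero <url> entries, the form
    used here to signal that no AMP URLs remain."""
    root = ET.Element("urlset", {"xmlns": SITEMAP_NS})
    return ET.tostring(root, encoding="unicode", xml_declaration=True)

print(empty_sitemap())
```

Whether an empty sitemap alone deters crawling is debatable; as this case shows, Googlebot may keep revisiting URLs it already knows about for years regardless.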
Compounding the problem, both the HTTP and HTTPS versions of the AMP URLs are being crawled, suggesting that Googlebot has not fully processed the changes. This situation highlights the complexities involved in transitioning away from previously implemented AMP solutions, and the difficulty website owners face in ensuring their changes are effectively communicated to search engines.
The ongoing crawling of these outdated AMP URLs not only raises concerns about potential crawl budget issues but also poses challenges for site owners trying to maintain an optimised presence in search results. It underscores the need for clear guidance from search engine representatives, such as John Mueller, to help site owners navigate the intricacies of managing legacy subdomains and ensuring that their site architecture aligns with current best practices.
Mueller’s Response
John Mueller provided two clear recommendations for managing legacy AMP subdomains:
- Keep the existing redirect setup.
- Remove the subdomain’s hostname from the DNS entirely.
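The first option, keeping the redirects, amounts to a single host-level rewrite: every URL on the AMP hostname maps to the same path on the main domain. As a rough sketch, using hypothetical hostnames (the thread does not name the site), the mapping a 301 redirect rule would implement looks like this:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical hostnames for illustration only.
AMP_HOST = "amp.example.com"
MAIN_HOST = "www.example.com"

def amp_to_canonical(url: str) -> str:
    """Rewrite a legacy AMP-subdomain URL to its main-domain equivalent,
    forcing HTTPS so HTTP requests avoid a second redirect hop."""
    parts = urlsplit(url)
    if parts.hostname != AMP_HOST:
        return url  # not an AMP URL; leave untouched
    return urlunsplit(("https", MAIN_HOST, parts.path, parts.query, parts.fragment))
```

In practice this mapping lives in the web server or CDN configuration as one 301 redirect rule rather than in application code; the point is that it is mechanical, so leaving it in place costs very little.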
Mueller also addressed worries regarding crawl budget, suggesting that they may be unfounded. He explained, “If it’s a separate subdomain, it’ll generally have its own crawl budget. Also, with 500,000 pages, I don’t think you would need to be overly concerned about crawl budget, even if it were not on a separate hostname.” This indicates that, for larger websites, the impact on crawl efficiency is likely minimal, allowing site owners to focus on more pressing concerns.
Context
This guidance arrives as numerous publishers reassess their strategies for implementing AMP.
Mueller’s advice indicates that website owners do not necessarily need to resort to complicated technical fixes when dealing with deprecated AMP content.
For those encountering similar issues, he recommended referring to Google’s “Large Site Owner’s Guide to Managing Your Crawl Budget” for further insights and strategies. This resource can provide additional support in optimising crawl efficiency and handling legacy content effectively.
Analysis
Mueller’s response puts crawl budget concerns into perspective. Google’s own guidance treats crawl budget optimisation as a priority mainly for very large websites, typically those with a million or more unique pages, where ensuring the most valuable pages are crawled and indexed efficiently genuinely matters. For publishers at that scale, understanding how Googlebot allocates its crawling resources can help maximise visibility in search results.

For a site with around 500,000 pages, the situation is different. Mueller’s comments suggest crawl budget is unlikely to be a pressing issue at that size, particularly when the legacy URLs sit on a separate hostname with its own crawl budget. While it is still useful to understand how crawling works, sites in this range can focus on other areas of optimisation without constant worry about crawl budget constraints.

Ultimately, Mueller’s insights provide clarity for website owners of various sizes: the largest publishers should actively manage their crawl budgets to maintain search performance, while those with a more moderate number of pages can prioritise other strategies. Understanding this distinction leads to more effective resource allocation and a more tailored approach to SEO.
Next Steps
This guidance offers clarity for technical SEO professionals who are navigating the changes related to AMP implementations.
When managing outdated AMP setups, there are a few options available:
- You can maintain the existing 301 redirects to direct traffic effectively.
- Alternatively, you may consider implementing a DNS-level solution to remove the AMP subdomain entirely.
- For more tailored advice, consulting Google’s documentation on crawl budgets can provide useful insights for specific scenarios.