Google’s Preferred Sources tool, created to give users more control over which websites appear in Top Stories, is increasingly showing spammy and low-quality domains. Instead of highlighting only reputable sites, the tool has surfaced copycat websites, squatted lookalike domains, and even parked domains, some of which have barely any content beyond their homepage.
The Preferred Sources feature is designed to allow users to personalise their news feeds. By selecting their favourite outlets, users can ensure these sites appear more frequently in Top Stories, offering a tailored experience beyond Google’s standard ranking algorithm.
This personalisation does not prevent other news sources from appearing; it merely reflects the preferences of the user. The tool was intended to benefit legitimate, high-quality sites by giving them more visibility. However, recent findings suggest it is being exploited in ways that undermine its purpose.
A key issue involves domain squatters who register domains that closely mimic popular websites. They typically keep the recognisable name but swap the extension, for example replacing .com with .com.in or .net.in (registrations under India’s .in country-code TLD), creating a copycat address that looks almost identical to the original.
So when a well-known site operates on a .com or .net domain, squatters may register the same name under an Indian extension. These cloned domains can then show up in Google’s Preferred Sources tool, confusing users and diverting traffic away from the legitimate websites.
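To illustrate how this pattern could be spotted programmatically, here is a minimal Python sketch that flags a hostname reusing a trusted site’s name under a different extension. The short suffix list and the helper names (registrable_name, looks_like_copycat) are illustrative assumptions for this article, not part of any Google system.

```python
# Illustrative sketch: flag domains that reuse a well-known site's name
# under a different extension, e.g. example.com vs example.com.in.
# KNOWN_SUFFIXES is a tiny hand-picked sample, not a full Public Suffix List.

KNOWN_SUFFIXES = (".com.in", ".net.in", ".co.in", ".com", ".net", ".org", ".in")

def registrable_name(hostname: str) -> tuple[str, str]:
    """Split a hostname into (name, suffix) using the sample suffix list."""
    host = hostname.lower().rstrip(".")
    for suffix in sorted(KNOWN_SUFFIXES, key=len, reverse=True):
        if host.endswith(suffix):
            name = host[: -len(suffix)].split(".")[-1]
            return name, suffix
    return host, ""

def looks_like_copycat(candidate: str, trusted: str) -> bool:
    """True if the candidate reuses the trusted site's name under another suffix."""
    cand_name, cand_suffix = registrable_name(candidate)
    trusted_name, trusted_suffix = registrable_name(trusted)
    return cand_name == trusted_name and cand_suffix != trusted_suffix

if __name__ == "__main__":
    print(looks_like_copycat("huffpost.com.in", "huffpost.com"))  # True
    print(looks_like_copycat("huffpost.com", "huffpost.com"))     # False
```

A real-world check would need the full Public Suffix List rather than a hard-coded sample, but the comparison logic stays the same: same registrable name, different suffix.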
It remains unclear whether these domains are being added to the tool manually by their owners or if Google is automatically picking them up. Either way, the result is the same: low-quality or irrelevant sites appear alongside recognised news outlets.
A practical example can be seen when searching for a popular SEO tool within the Preferred Sources feature. The genuine domain is displayed, but a parked domain using the Indian .com.in extension also appears. This demonstrates the tool’s vulnerability to spam and copycat sites.
The tool’s availability in the USA and India may partly explain why these Indian domains are showing up. The system also seems to lack any mechanism to verify the authenticity or content quality of the domains it lists.
Other high-profile websites have also been affected. A search for HuffPost within the tool returned a copycat site on an Indian domain, which featured unrelated content such as payday loans, personal injury lawyers, and luxury products, rather than legitimate HuffPost articles.
Similarly, a site mimicking the Search Engine Journal domain surfaced in the Preferred Sources tool. This domain offered little original content but still appeared in the user interface, highlighting a broader problem with the tool’s quality control.
Many of these spammy sites are barely indexed, often with nothing beyond the homepage in Google’s index. This suggests that Google’s indexing and verification processes are not rigorous enough to keep low-quality or misleading sites from being recommended.
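For anyone who wants to spot-check a suspicious domain themselves, one crude signal is how many distinct internal pages its homepage links to; parked and copycat sites typically link to almost none. The Python sketch below, using only the standard library, implements that heuristic; the five-link threshold and the looks_thin helper are illustrative assumptions rather than anything Google actually measures.

```python
# Rough sketch, assuming a parked or copycat domain exposes very few
# internal links beyond its homepage. Fetch one page and count distinct
# same-host links as a crude "thin site" signal.

from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    """Collect href values from every <a> tag on a page."""
    def __init__(self) -> None:
        super().__init__()
        self.hrefs: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

def looks_thin(homepage_url: str, threshold: int = 5) -> bool:
    """Return True if the homepage links to fewer than `threshold` internal pages."""
    host = urlparse(homepage_url).netloc
    html = urlopen(homepage_url, timeout=10).read().decode("utf-8", errors="ignore")
    parser = LinkCollector()
    parser.feed(html)
    internal = {
        urljoin(homepage_url, href)
        for href in parser.hrefs
        if urlparse(urljoin(homepage_url, href)).netloc == host
    }
    return len(internal) < threshold
```

This is only a heuristic: a legitimate single-page site would also trip it, so it is best treated as a prompt for manual review rather than a verdict.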
The presence of these copycat domains raises questions about how SEOs might be exploiting the system. Some may register these domains intentionally to gain visibility or traffic, taking advantage of the tool’s lack of scrutiny.
From Google’s perspective, the tool is meant to empower users and help reputable sites gain exposure. However, without safeguards to prevent spam or verify site legitimacy, the feature risks being overshadowed by low-quality and misleading websites.
This situation emphasises the challenge of maintaining quality in automated and semi-automated tools. Even features designed to enhance user experience and promote trusted sources can be exploited if the platform does not actively monitor submissions.
Until Google introduces stronger controls or verification measures, users may continue to encounter copycat domains and spammy sites in Preferred Sources. For legitimate websites, this reduces the value of the tool and may dilute the intended benefit of enhanced visibility in Top Stories.
In summary, while Google’s Preferred Sources tool has the potential to improve personalisation and help high-quality websites gain exposure, its current vulnerability to spam and domain squatters undermines its effectiveness. Both users and publishers will need to monitor the tool closely until Google implements improvements.