Recently there has been discussion about whether Google has developed some kind of ‘TrustRank’ which is used to determine search engine positions. If this does indeed exist, it shows that Google is determined to give high search engine positions only to websites it deems trustworthy, i.e. sites that do not partake in any blackhat/spam techniques.
Although there is no official confirmation from Google that TrustRank actually exists, they did recently register a trademark for “TrustRank”, and perhaps the introduction of the Sandbox is just an extension of a TrustRank implementation.
What Could Affect My Website’s TrustRank (TR)?
Domain Age – New websites currently get filtered (the Sandbox), so it seems feasible that the older a website gets, the more trust is placed in it.
Length of Domain Registration – Registering a domain for a longer period may show you have a long-term vision for the website.
Regular Updates – Adding content to your site regularly shows that it is being actively maintained and cared for.
Backlinks – The whole concept of the World Wide Web is that websites interlink with each other, but there are three different groups of backlinks:
- Good Links: Good links are one-way links from websites on a similar theme to your own, placed inside a body of text.
- Not-So-Good Links: Not so good links might include reciprocal links, or backlinks from off-topic websites.
- Bad Links: Bad links are from websites that operate Blackhat* optimization techniques, or from websites that are banned.
* Blackhat is a term used to describe webmasters who purposefully use underhanded tactics to trick the search engines into giving them good rankings.
Unique IP – Having a unique IP address may show you care enough about your website to purchase one, and it also ensures your IP is not shared with any ‘bad neighbors’.
Sitemaps – Including a sitemap helps search engines spider your site, and it also helps visitors find what they are looking for. It may also build trust, since offering visitors an easy way to navigate shows you have their interests at heart; this isn’t just about making the site easier to crawl. Note: Again, no proof, just an idea.
Reliable Uptime – Sites with poor uptime lose trust.
Targeting KWs based on Traffic over Relevance – Every website is unique, and targeting the keywords most relevant to your website, rather than the ones that would simply bring the most traffic, may increase the trust placed in it.
Spam – Using any kind of Blackhat technique could lose you trust, such as doorway pages, cloaking, duplicate content, pointing multiple domains to the same site, hidden text, keyword stuffing, comment spamming, pop-ups, and hidden links.
- Cloaking – Displaying different content to the search engines from what ‘regular’ visitors see.
- Duplicate content – Copying content from other websites and publishing it on your website.
- Doorway pages (AKA landing pages) – Individual pages that are optimized for one specific keyword.
- Keyword stuffing – Overusing your targeted keyword/keyphrase on a webpage. This tactic is usually employed in doorway pages.
Going Off-Topic – Adding content to your website that is not related to your site theme might be seen as a form of spamming, and as a result hurt your TrustRank.
Bad Topics – Producing a website on a topic commonly associated with spam/blackhat might reduce your TrustRank. Examples include gambling, alcohol, anything illegal, hate speech, pornography, or anything that may get filtered by Google’s SafeSearch.
Too Many Terms in the Domain Name or URL – For example: www.these-are-all-of-my-keywords.com or www.domain.com/these-are-all-my-keywords.html. This applies particularly to keyword-stuffed URLs.
Creating a Google Sitemap – People who create a Google sitemap are usually doing it for the purpose of search engine optimization. So can simply creating one hurt your TrustRank?
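For context, a Google Sitemap is just an XML file following the sitemaps.org protocol. A minimal example (the domain and dates here are hypothetical) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want the search engines to know about -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
    <lastmod>2005-05-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

The file is typically saved as sitemap.xml in the site root and submitted to Google; on its own it simply lists your URLs, so it seems unlikely that the file itself could hurt you.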
Meta – Can spamming the meta tags of a webpage damage your TrustRank?
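For reference, meta tags sit in a page’s <head> section. A normal, non-spammy set (the site name and copy here are hypothetical) might look like this:

```html
<head>
  <title>Example Widgets - Handmade Widgets Online</title>
  <!-- A short, honest summary shown in search results -->
  <meta name="description" content="Example Widgets sells handmade widgets with free shipping.">
  <!-- A handful of relevant terms, not hundreds of stuffed keywords -->
  <meta name="keywords" content="widgets, handmade widgets">
</head>
```

“Spamming the meta tags” would mean stuffing these fields with long lists of repeated or off-topic keywords, which the engines can easily detect.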
Templates – Can Google recognize duplicate page layouts like they do with duplicate content?
Google Toolbar Tracking – Via the Google Toolbar it is possible to track how popular a website is by counting the number of hits it gets, not necessarily just from the search engines.
SERP (search engine result page) Tracking – Click-through rates, or monitoring whether visitors click back too soon; if they do, the site may be devalued for the term they accessed it for.