The saddest day for a webmaster is the day you wake up and find that your website no longer shows up in Google… that you have become the victim of a Google ban.
This is the worst thing that can happen to your website, especially if you run a business or a blog through it.
How can I find out if my site is banned by Google?
To find out, search Google for this:
site:www.yoursite.com
(replace www.yoursite.com with the domain name of the site you think may be banned, and note there is no space after the colon)
If the search returns no results, your site may be banned by Google, your robots.txt file may be blocking it, or you may simply not have submitted your site to Google yet.
Check that your robots.txt file doesn't contain the code below:
User-agent: *
Disallow: /
Don't use "Disallow: /" on its own; replace "/" with the directory you don't want Google to index (e.g. "Disallow: /Personal").
If you fix this, your site should be listed in Google again; if it still isn't, it may be banned.
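As a rough sketch (my own example, not from the post), a robots.txt that keeps Google out of a single private directory while leaving the rest of the site crawlable could look like this (the "/personal/" directory name is just a placeholder):
# Allow all crawlers, but keep the /personal/ directory out of the index
User-agent: *
Disallow: /personal/
Everything that isn't explicitly disallowed stays crawlable, so Google can still index the rest of your site.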
Here are some ways to keep your website in Google's good books:
- Up-time: Make sure your website has the best uptime possible, which means you need reliable web hosting. If Google visits your website while it is down, you risk having your site banned.
- Spam: No search engine loves spam, so don't use spamming techniques to promote your website. Doing so will almost certainly get your site penalized for at least six months. Never use "black hat" methods like doorway pages, hidden text, or cloaking; you are only setting yourself up for disaster.
- Plagiarism: Make sure your content is original and relevant. Websites that duplicate content are punished with lower rankings and even bans. Your content should always be made for human consumption and relate to your website’s theme.
- Linking: Linking is a two-way street. Not only should the links going out of your website be high quality, but the links pointing to your website should also come from sites Google respects. Avoid link farms and any paid linking service. Excessive linking is another red flag for Google, so when you add internal links on your own site, do it in moderation and with purpose.
- Cleanliness: Be sure your site is Google friendly. Build your website so that Google is able to index every page. Include a sitemap (a minimal example follows this list). Be sure all your pages are working and that there are no broken links.
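For the sitemap mentioned above, here is a minimal sketch of a sitemap.xml (my own example; the URLs are placeholders, so list your real pages and put the file at the root of your site, e.g. www.yoursite.com/sitemap.xml):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/</loc>
  </url>
  <url>
    <loc>http://www.yoursite.com/about</loc>
  </url>
</urlset>
Once it is in place, you can submit the sitemap to Google so the crawler knows about every page you want indexed.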
Don’t try to fool Google with tricky tactics. The price you will pay is far worse than the effort it takes to do things the right way!
(Source: My Brain and my friend “Brad”)
(Image Credits: Kranthi)
3 Comments
I'd heard about the robots.txt file before, but I never used it until now.
For me it's just a way to tell the search engine spider which pages you don't want indexed.
Yes, it is true about reliable web hosting: if Google finds your site down, it cannot update your site in its database. But I've heard or read somewhere that the spider will give you a chance and visit again next time before dropping your site from its index.