How Google Destroyed the Value of Google Site Search

Do You Really Want That Indexed?

On-demand indexing was a great value-added feature for Google Site Search, but now it carries more risk than ever. Why? Google decides how many of your documents make it into its primary index, and if too many of them are arbitrarily deemed "low quality," you get hit with a sitewide penalty. You did nothing but decide to trust Google & use Google products. In response, Google goes out of its way to destroy your business. Awesome!

Keep in mind that Google was directly responsible for the creation of AdSense farms. And rather than addressing them directly, Google had to roll everything through an arbitrary algorithmic approach.

<meta name="googlebot" content="noindex" />

Part of the prescribed solution to the Panda Update is to noindex content that Google deems to be of low quality. But if you tell GoogleBot to noindex some of your content while also using Google for site search, you destroy the usability of their site search feature by making that content effectively invisible to your customers. For Google Site Search customers this algorithmic change is even more value-destructive than the arbitrary price jack Google Site Search recently pulled.
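To make the conflict concrete, here is a sketch of the robots meta tags involved. The first two forms are documented by Google; the missing third option is the problem — as far as I can find, there is no documented crawler name that would let you target web search and site search separately:

```html
<!-- Blocks ALL well-behaved crawlers from indexing the page -->
<meta name="robots" content="noindex" />

<!-- Blocks only Google's crawler; other engines may still index -->
<meta name="googlebot" content="noindex" />

<!-- What you would need, but which does not exist: a documented
     site-search-specific name, so a page could stay out of web
     search results while remaining findable via Google Site Search -->
```

Since Site Search apparently rides on the same GoogleBot crawl, the second tag pulls the page out of both at once.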

We currently use Google Site Search on our site here, but given Google’s arbitrary switcheroo styled stuff, I would be the first person to dump it if they hit our site with their stupid “low quality” stuff that somehow missed eHow & sites which wrap repurposed tweets in a page. 😀

Cloaking vs rel=noindex, rel=canonical, etc. etc. etc.

Google tells us that cloaking is bad & that we should build our sites for users instead of search engines, but now Google’s algorithms are so complex that you literally have to break some of Google’s products to be able to work with other Google products. How stupid! But a healthy reminder for those considering deeply integrating Google into your on-site customer experience. Who knows when their model will arbitrarily change again? But we do know that when it does they won’t warn partners in advance. 😉

I could be wrong in the above, but if I am, it is not easy to find any helpful Google documentation. There is no site-search bot on their list of crawlers, questions about whether they share the same user agent have gone unanswered, and even a blog post like this probably won't get a response.

That is just one more layer of hypocrisy: Google states that if you don't provide great customer service then your business is awful, while going to the dentist is more fun than trying to get any customer service from Google. 😀

I was talking to a friend about this stuff and I think he summed it up perfectly: “The layers of complexity make everyone a spammer since they ultimately conflict, giving them the ability to boot anyone at will.”
