I'm breaking my usual pattern and will write this post in English.
As you might have heard, many of my websites* were blacklisted by Google about two months ago. These "wiki-style" Google Maps mashup sites previously had around 50,000 weekly visitors in Sweden, which is quite a lot for a small country. They have rapidly become really popular, with a lot of people adding reviews and pictures and helping out with updates and additions. The sites are, after about a year in existence, by far the most comprehensive in Sweden when it comes to listing, for example, sushi restaurants, beaches, cafés and churches.
The main reason the sites have become popular is that each specific "interest" has its own website. **
The blacklisting came as a real surprise to me, as I was quite sure my sites didn't break any of the Google Webmaster Guidelines. The issue has since been discussed in the Google Webmaster Support forum, and Swedish Google employee Fredrik Andersson (based in Dublin) referred to the blacklisting as being due to duplicate content. Most people agree that this was incorrect. The sites have very little duplicate content, and Fredrik's arguments were not actually about duplicated content, but about "too many of the sites turning up in the search results", which is basically a result of their popularity.
A lot of people in Sweden thought the blacklisting was unfair, and that the search results in Sweden became a lot worse when these sites were excluded. One Google employee even claimed in Dagens Industri (a major business newspaper) that the blacklisting was a mistake. Nothing happened, however. Yesterday Swedish journalist Andreas Ekström blogged about it again.
Now here is what is really exciting. In the comments to Andreas's blog post, Matt Cutts writes:
" Hi Andreas, my name is Matt Cutts and I’m the head of the webspam team at Google. Although we haven’t communicated as much as people might like externally, we have been discussing this situation quite a bit within Google.
One of the key things that we want Google’s search results to have is a diverse set of different websites. Whether intentionally or unintentionally, we were seeing an unusually large number of Ted’s sites showing up for some searches, which can cause a poor user experience if the results aren’t diverse. I hope to have a chance to discuss this situation with all the relevant people within Google over the next few days.
Google might not always communicate as often as some people would like, but we do listen to the feedback that we get and appreciate it–even when it comes in the form of criticism."
This seems to be the real Matt Cutts, for sure. Wow. Obviously these questions have gone a long way, and it seems like the jury is still out on the whole issue. Oh, how I hope the "relevant people" that Matt mentions are Larry Page and Sergey Brin!
So, Matt, if you are reading this:
- First of all I want you to know that I love Google. I am a vocal supporter of the Google ecosystem and try to encourage developers in Sweden, young and old, to join it.
- I agree with you that too many similar sites on the same search results page is a problem, and I should have been more attentive to this from the beginning. I have now taken a bunch of measures to fix these unintentional results. It is my goal, too, that the Google search results remain relevant.
- However, I think it was wrong of you to kill my business without notice because of a somewhat vague problem that was not mentioned in the Webmaster Guidelines.
- Next time you see a problem like this, wouldn't it be better to change the guidelines first (and give people a chance to adapt) before you start blacklisting websites?
- And I know this might sound like something against your policy... but wouldn't it have been easier to send me an email asking me to fix this in the first place, if I wasn't actually breaking any guidelines? I'm sure the people who reviewed my sites know quite well that I'm not a "black hat" SEO. I would have been happy to comply with any suggested changes.
Footnotes:
* Cafekartan.se, Sushikartan.se, Badkartan.se, Kyrkokartan.se, Hotellkartan.se, Wifikartan.se, Campingkartan.se, Gymnasiekartan.se, Vintagekartan.se, Studentrabattkartan.se, Jobbkartan.se and Minkarta.se were all blacklisted, and cannot be found even when searching for very obvious search terms like cafekartan. They are still left in the index, but their value has been set to close to zero.
Annonskartan.se, Vandrarhemskartan.se and Flygkartan.se were not blacklisted.
** There's even a website for Swedish airplane enthusiasts, where Swedish pilots review landing fields in Sweden! All these different sites share a common user login, so people can easily review things across different maps. The goal has been to eventually open up the platform so that anybody, anywhere, can create their own map communities. A lot like Ning.com, but for maps. (My working name for this is Maploving.com. As with Ning.com and Blogger.com, you can let your community reside on a subdomain, like sushitokyo.maploving.com or wifispots-in-goa.maploving.com, or you can connect your own top domain to it, like I have done with these initial maps.)
11 comments:
Hi Ted, thanks for writing this post in English. That's much easier for me to read. :)
I'm not sure that everyone realizes just how many sites you've got--often when we see someone make dozens of sites like this, it's because they're trying to do industrial-size search engine optimization of all those sites.
The last complicating factor is that I believe you had added a nofollow on the crosslinking between the sites, which is great, because that much crosslinking can be perceived as pretty spammy. But then after you didn't get a reply in the forum, I think you removed the nofollow? So now those domains are cross-linking again.
Putting a nofollow on those crosslinks was a good move to show that you're not just trying to do some sort of link farm where the domains crosslink even when it doesn't make sense for the user. If you'd be willing to re-enable the nofollows so that the domains aren't cross-linking, I should be able to submit a reconsideration request on your behalf.
Hi Matt!
I can't stress enough how happy I am to hear this.
This is what I have done:
I have added "nofollow" to all crosslinks between the sites...
http://www.flickr.com/photos/26238837@N07/3657662562/sizes/o/
http://www.flickr.com/photos/26238837@N07/3657739532/sizes/l/
... *except*, though, for the navigation menu on the index page:
http://www.flickr.com/photos/26238837@N07/3658055167/sizes/o/
Do you want me to add "nofollow" to these index page links as well? (Just reply yes/no and I'll comply without question.)
I have also added "nofollow" to a lot of the internal links.
And "noindex" to ALL geo-pages, like this one:
http://www.flickr.com/photos/26238837@N07/3657662562/sizes/o/
If you find any other links that seem in need of alteration, please let me know.
Throughout all of this I haven't rolled back any of the nofollow/noindex changes I've made, only added more. (A rough sketch of what these changes look like follows below.)
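For other webmasters following along, the two changes boil down to adding rel="nofollow" to links between the sister domains and a robots "noindex" meta tag to the geo-pages. Here is a minimal illustrative sketch in Python; the domain list, function names and markup are simplified placeholders made up for this example, not the actual code running on the sites:

```python
# A simplified sketch of the two changes described above (placeholder names
# and markup, not the actual site code).
from urllib.parse import urlparse

# Hypothetical list of sister domains whose crosslinks should carry nofollow.
SISTER_DOMAINS = {
    "sushikartan.se",
    "badkartan.se",
    "kyrkokartan.se",
    # ... and the rest of the *kartan.se sites
}

def crosslink_html(href: str, text: str) -> str:
    """Render a link, adding rel="nofollow" when it points to a sister domain."""
    host = urlparse(href).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    if host in SISTER_DOMAINS:
        return f'<a href="{href}" rel="nofollow">{text}</a>'
    return f'<a href="{href}">{text}</a>'

def geo_page_head(title: str) -> str:
    """Render the <head> of a geo-page with a robots noindex directive."""
    return (
        "<head>\n"
        f"  <title>{title}</title>\n"
        '  <meta name="robots" content="noindex">\n'
        "</head>"
    )

if __name__ == "__main__":
    print(crosslink_html("http://www.badkartan.se/", "Badkartan"))  # gets rel="nofollow"
    print(crosslink_html("http://example.com/", "Example"))         # left as a normal link
    print(geo_page_head("Beaches in Stockholm"))
```

The idea is that noindex keeps the individual geo-pages out of the index entirely, while nofollow simply tells Google not to count the crosslinks between the sister domains.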
Yours, Ted
On a plane, so I'm on a phone, but I'll check it out when I land. :)
Matt
It would still be very interesting to hear what Matt has to say about the issues (points 3 and 4) raised by Ted...
Why are you blacklisting, without prior notice, sites that are not violating your rules?
Matt, very nice of you to engage in this conversation to help Ted.
However, the double standard here is not OK, so explain why Ted can't do his simple cross-linking of similar-type sites, albeit with different topics, when you allow some of the strongest sites on the net to crosslink (the CBS network, for example). To take the CBS complaint even further: their sites always crosslink in the footer, with heavy anchor text, and to totally irrelevant sites or pages on their other sites.
To be fair, it's not only CBS who does this, but you, me and Ted all know that CBS has plenty more sites than he does.
Have you ever slapped CBS on the wrist with a ban? Or do you have any intention of doing so, or at least applying the same rules to all site owners?
Hi Ted, after looking into it, I would recommend adding nofollows on the cross-linking in the navigation menus as well.
Anonym, we look at a wide variety of factors, including how much work is being put into each site, the number of sites, the value-add of each site, etc. The example you mentioned (CBS) puts a lot of work into developing each of those domains. Ted has many domains, and the danger is that without enough value-add or if users get results from lots of the different domains, that can be a bad user experience.
Hi Matt!
Now I've added nofollow on the index page navigation links as well. Hope it looks alright...!
As the previous poster pointed out, a lot of sites crosslink. Here are some examples from Sweden:
http://www.flickr.com/photos/ted_valentin/3743089350/sizes/o/
Just a few examples, there are obviously many more.
Is there anything about cross linking in the webmaster guidelines?
Thanks, Ted. Excessive crosslinking of large numbers of sites (esp. if the sites are primarily autogenerated or towards the lower end of the value-add spectrum) would fall under the link scheme part of our guidelines: http://www.google.com/support/webmasters/bin/answer.py?answer=66356
With that change to reduce the cross-linking, I'll submit a reconsideration request on your behalf.
That's a very straight answer and a compelling argument.
I'll be happy to comply, and I'm looking forward to going back to doing what I enjoy most and do best: building and improving my websites. (Making them climb further up the value-add spectrum.)
Thank you for checking in on us here in Sweden!
A note for all you other readers. When things are back to normal I will write a longer blog post outlining:
* Advice for other web developers
* A few constructive ideas for Google
* All the lessons I've learned in the last two months
Recently a lot of people in Sweden (especially in traditional media) have been criticizing Google, but most of the time for the wrong reasons. The task of providing relevant search results is unbelievably complex - and Google does a very good job at it! That is why we use Google, and not some other search engine. However, not even Google is infallible. (No one is!)
Matt recently wrote a great blog post about Google's internal/external image, which you can find here:
http://www.mattcutts.com/blog/taking-google-feedback/
Happy to help, and hope the other comments make sense. I haven't made it to Sweden yet (I've been maybe 20km away before), but maybe sometime I'll make it to Sweden in person. :)