6 Thoughts on Google search challenges for 2010
January 2, 2010

There is a lot of talk about search engines and spam. Matt Cutts works his butt off trying to rid the SERPs of manipulation. On my end of the spectrum, I am quite concerned. Not about the manipulation, but about the measures being taken against it. Why do I say this? Well, basically the problem lies in links and the value that Google gives them.

Building links is not difficult

There are so many links out there nowadays that it is not difficult to build them. Social media has completely taken away the need to buy links as well. You buy content instead. Or websites, as a link building consultant said the other week. If you’re in the market for content, you know you pay extremely little for about 100 articles, but you spend a fortune on 100 links. 100 domains aren’t expensive at all either, and thus you can build your backlinks yourself without having to worry about their validity, their monthly status and so on. It is much simpler and a lot better.

That’s the quality links. Then you combine that with some trackback spam to get the nofollow and domain-name backlinks to your content, and you start getting the volume as well. Skip the catalogs and the directories, as they take too much time and generally don’t have enough content surrounding the links to make them worth anything. Perhaps a few good news aggregators where you can guarantee that the co-citation is good. Put your RSS feed into a nice mix of them. This can be achieved by using Twitter as well. No worries that the links are nofollow. The important thing is that you get your links in among those other quality content links within your vertical.

Affiliate does not necessarily mean bad content

In the more mature verticals you can clearly see that the quality of the content has a high bearing on the popularity, the ranking and the maturity of the affiliate. I won’t out any examples here, but let’s just take it for what it is, and then we can sort out the examples over Skype if you don’t agree with me.

The problem is that a lot of good content still doesn’t get to the top, as Google continues to put so much value on links. The big affiliates can afford to build content and quality to match their position. The people who started out building quality content instead of links are stuck in the race for the top positions.

We still see obvious spammers get to the top of the SERPs and we see a lot of quality content being penalized. All because Google values links too highly and doesn’t take content into consideration. We do see a lot fewer spammers than before, but often this comes at the cost of penalizing sinners who care about their content as well.

I’m fine with that, but some spammers seem to get a green card in the SERPs whilst others, who only try to compete on equal terms, get the Cutt-Fist up their arse faster than a rabbit… The problem lies within the links. If Google cared less about the links and more about the content, I think we would all be better off.

One way to care less about the links would be to pay more attention to the searcher. Not in the sense that personalized search does, which I will cover below, but with respect to what the searcher is doing right as he or she searches for the specific query. I know there are some privacy aspects to this, but Google already stores the data, so why not use it? This is the only reason I would like to talk about the REAL TIME web. If I search for one thing and then for another, I want my second search to depend on my first search if they are in any way related.

Let’s say I search for “potatoes” and then I make my second search “chips”. Then I want Google to assume that I am searching for “potato + chips” and not “computer + chips” or “some dude called Chip”. Users are used to searching several times before making their decision. So instead of focusing on links, focus on the search and the sequence of searches. You can also give it the very nice engineering name “Sequential Search” and launch it as something very nice at one of your conferences. :) A rough sketch of the idea follows below.
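
To make the idea concrete, here is a minimal sketch of what I mean by sequential search, in Python. Everything here is hypothetical: the sense inventory, the scoring, the function names. It is an illustration of the principle, not a claim about how Google would build it.

# A minimal sketch of "sequential search": letting the previous query in a
# session disambiguate the current one. The sense inventory below is made up
# for the "potatoes" -> "chips" example.
SENSES = {
    "chips": [
        {"sense": "potato chips", "context": {"potato", "potatoes", "snack", "crisps"}},
        {"sense": "computer chips", "context": {"computer", "cpu", "processor", "silicon"}},
    ],
}

def disambiguate(query, previous_query):
    """Pick the sense of `query` best supported by the previous query."""
    prev_terms = set(previous_query.lower().split())
    candidates = SENSES.get(query.lower())
    if not candidates:
        return query  # nothing ambiguous to resolve
    # Score each sense by how many of its context words the previous query used.
    best = max(candidates, key=lambda c: len(c["context"] & prev_terms))
    if not best["context"] & prev_terms:
        return query  # the previous query gave no signal, leave the query alone
    return best["sense"]

print(disambiguate("chips", "potatoes"))                # -> potato chips
print(disambiguate("chips", "intel processor prices"))  # -> computer chips

The point is simply that the previous query is a cheap, already-stored signal that resolves the ambiguity. No link data is needed at all.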

Because what you’re doing now in the name of quality is actually worsening your index.

1. Load time variable gives spammers the upper hand

When Google starts looking at load time as a ranking variable, it isn’t doing its job any better. C’mon Matt. Good, unique content sometimes takes a while to load. When you say you give load time extra value, you are promoting bare text content and code quality rather than content quality.

Think about it, Googlers. Who does this actually give the upper hand to? Who has the leanest and cleanest code out there and the capability to really meet this requirement? Spammers, naturally: the people willing to manipulate and to work hard on their code, content and page structure, who do not really care about the user experience and who only want to rank high and then convert traffic. These are the people who will have the leanest pages out there and thus load the fastest. The toy calculation below shows what I mean.
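
Here is a toy calculation, with entirely made-up numbers and a made-up scoring formula, showing how folding load time into a ranking score lets a lean spam page overtake a slower, content-rich page:

# Hypothetical ranking score: relevance discounted by page load time.
def score(relevance, load_time_s, speed_weight=0.3):
    speed_factor = 1.0 / (1.0 + load_time_s)  # faster pages get closer to 1.0
    return (1 - speed_weight) * relevance + speed_weight * speed_factor

rich_content = score(relevance=0.80, load_time_s=4.0)  # heavy, unique page: 0.62
lean_spam = score(relevance=0.65, load_time_s=0.3)     # stripped-down page: ~0.69
print(rich_content < lean_spam)  # True: the spam page wins on speed alone

The weights are invented, but the mechanism is the worry: any speed term in the formula is a term spammers are better positioned to max out than content producers are.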

2. Brands as a means to make SERPs cleaner

I never understood the brand infatuation. I think it sucks that the big institutional brands get an easier time in the search engines. Newcomers in mature markets used to have the upper hand because they were fast movers and could adjust. What the brand prerequisite does is cripple these fast movers.

3. Personalized search

F**K OFF… I don’t know who is interested in getting search results based on their entire history going back two years. Two years back is not relevant now. Not even my last visit to Google for an answer is important for the search I am doing right now. Only what I am searching for right now is important for what I am searching for right now. When I am searching for something, I want the best possible match from the WHOLE world presented to me, not what I have previously visited and liked.

Perhaps my range is limited, and I know this. Then I want suggestions of what others like, and I don’t want to go to Twitter to find them. I want them present in the Google SERP right off the bat.

4. Rich Snippets

One good change in this direction was rich snippets. But you really have to make this better, as it currently requires the website developer to include the markup in their page setup. You can detect a comment when there is a comment. You can detect a rating when there is a rating. You don’t need us to point it out in our code. Help us out in this area and present it in the search results to the user; the sketch below shows the kind of detection I mean. That is an extremely good way to improve the search results.
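
As a minimal sketch, assuming nothing more than the raw HTML of a page, here is the kind of markup-free rating detection I am asking for. The patterns and the sample page are hypothetical; a real crawler would obviously be far more robust than two regular expressions.

import re

# Patterns for ratings as they appear in visible text, no special markup needed.
RATING_PATTERNS = [
    re.compile(r"(\d(?:\.\d)?)\s*(?:/|out of)\s*5"),      # "4.5/5", "4 out of 5"
    re.compile(r"rated\s+(\d(?:\.\d)?)\s+stars?", re.I),  # "Rated 4 stars"
]

def extract_rating(html):
    """Return the first rating found in the page's visible text, or None."""
    text = re.sub(r"<[^>]+>", " ", html)  # crude tag stripping for the sketch
    for pattern in RATING_PATTERNS:
        match = pattern.search(text)
        if match:
            return float(match.group(1))
    return None

page = "<div class='review'>Great product! Rated 4 stars by 12 users.</div>"
print(extract_rating(page))  # -> 4.0

If a couple of regexes can pull that out, a search engine certainly can, without asking every developer to annotate their pages first.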

5. Real Time Search

Sorry, but I don’t get this. If I am searching on Google, I want static search results. If I search on Twitter, I want Twitter’s right-now results. Don’t mix them together. It sucks!! I don’t want those “real time” search results cluttering my view when I am looking for resources on the web.

Users don’t use Twitter because it gives better results than Google. They use Twitter because they get replies from friends they already have some relationship with. If you present anonymous tweets, in the sense that the searcher doesn’t know the user who produced the tweet, then YAAAAWN… irrelevant.

Social search is much better in that respect. But I really need it to be presented in a new way. The old SERP doesn’t have room for these features. Universal search is beautiful, but you had to work on that one for a while to make it fit. Don’t force social search. It is a good feature, but it won’t work if you just push it in there. It will only work if you make it a process that the users can adopt slowly.

6. Opt Out

I know you people at Google like to force shit on the users. But PLEASE give us the opportunity to remove your filters. Please give us the opportunity to OPT OUT. Put the opt-out options in an open and extremely easy Ajax form at the top right of the page. Make it easy. That’s all I’m asking. Put a little red text at the bottom of filtered search results if you have to: “This site has been penalized for link spam.” Since the new resubmission process came along, you are very transparent towards the site owner about what is wrong with the website. Please make this visible to all users, so that we know who does their job and who doesn’t.

I guess that’s all for now.

//Jesper


About author

Jesper Åström

Jesper Åström is a digital tactician hired by people and companies all over the world to help solve their digital challenges. He is also a well-liked educator and business creator who currently develops educational programs in collaboration with Hyper Island in Sweden and Singapore, whilst building businesses in Sweden and Japan. Subscribe to Jesper’s YouTube channel.

