Digital Marketing Blog
Search Engine Optimization
April 6, 2010
I talked to Christian Rudolf tonight about search. It is not uncommon for us to talk about links, blog posts and what is happening in the onlinedom. We started talking about what the true essence of search engine optimization really is. We both agree that it is about meeting visitors with exactly the content they want to find for their query. In many cases we believe we are better than Google at determining this, for the simple reason that we are people and they are mainly an algorithm.
I can’t speak for Christian, but that is at least why I have no problem bending the rules in the Google guidelines to achieve results. That’s for my own projects. Most of my clients have enough link steam that they just have to build their websites and fill them with findable content, and they rank in prominent positions as a result. They are fortunate.
The hidden web is still concealed
There is so much good content on the web. Only a fraction of it is published today in a format that is accessible through the search engines. There is a huge resource in the “hidden web” that is yet to be unleashed. Once we start making more data formats available for search and clustering, there will be a completely new set of search results for users to browse through.
Today we get related information as a result of meta tags. Imagine the same relevance, but instead of linking documents you are linking pieces of data. I know it is difficult to imagine, so I’ll try to give you an example. What if you traveled to Hamburg and wanted to eat a hamburger at a restaurant that served good food close to your hotel?
How linked data really works
The search results currently returned for a query like that are a very broad approximation of what your keywords might really mean. But if you are able to link data, then the search engine will be able to take the place Hamburg, which it knows is a city in Germany, then look at data sets of distances to restaurants and match these with online reviews for the returned venues. That way it will be able to return a result to you based not on keywords, but on the meaning of your search.
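A rough sketch of the idea in Python, with entirely made-up data (the restaurants, distances and review scores are invented for illustration): the entity Hamburg is linked to restaurants, each restaurant is linked to a distance and a review score, and the answer is produced by following those links rather than matching keywords.

```python
# Hypothetical linked data: the entity "Hamburg" connected to restaurants,
# each restaurant connected to a distance-from-hotel and a review score.
city = {
    "name": "Hamburg",
    "type": "city",
    "country": "Germany",
    "restaurants": [
        {"name": "Burgerei", "serves": "hamburgers", "km_from_hotel": 0.4, "review_score": 4.5},
        {"name": "Hafengrill", "serves": "seafood", "km_from_hotel": 0.2, "review_score": 4.8},
        {"name": "Beef & Bun", "serves": "hamburgers", "km_from_hotel": 1.9, "review_score": 4.7},
    ],
}

def answer(city, dish, max_km):
    # Follow the links: city -> restaurants, filter on what they serve and
    # how far away they are, then rank on reviews instead of keyword match.
    matches = [r for r in city["restaurants"]
               if r["serves"] == dish and r["km_from_hotel"] <= max_km]
    return sorted(matches, key=lambda r: -r["review_score"])

print([r["name"] for r in answer(city, "hamburgers", 1.0)])  # -> ['Burgerei']
```

The point is that the filter and ranking operate on the meaning of the data (city, distance, dish, review), not on the words of the query.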
Sounds pretty sick, right? For now we have to use our best manipulation, architecture and infrastructure knowledge to create, promote and justify content that is relevant enough for you to feel satisfied with your search result. As Google is a bot, it would never be able to return search results as relevant if it wasn’t for the work of search engine optimizers. We make relevant content meet the user’s need, and not only their typed-in keywords.
And that’s in a sense why SEOs are like Internet doctors
I suppose we’re a bit like Internet doctors. Sometimes your stomach pain means you are lactose intolerant, and sometimes it means you are just bubbly from a night out partying. Yes, we earn money from our practices, and yes, some of us do bad deeds and promote content that doesn’t meet the quality standards you would expect. But the vast majority of us are not in it to destroy; we’re in it cause we love the internet (use Google Translate if you don’t know Swedish) and want to help improve it.
As search is such a great part of the Internet experience, we have chosen this mechanism as our arena to help you find your stuff. I too wish that we could rid the Internet body of its cancers, but as with all open, free systems, there are people who are willing to abuse them. I am no saint. On the contrary, I have done a lot of things I am not particularly proud of. Although I try to justify them in the name of science, I still feel bad when I set up an experiment that affects the web in a bad way.
And some unusually distorted final thoughts
To end with the beginning, and my discussion with Christian: there are a load of misconceptions about what search engine optimization is all about. I believe it is all about finding out what the searcher wants and then delivering the best possible content for the query that need is represented by. Almost like a doctor.
It is late and I just wanted to get this out of my system so that I could sleep.
January 2, 2010
There is a lot of talk about search engines and spam. Matt Cutts works his butt off trying to rid the SERPs of manipulation. On my end of the spectrum, I am quite concerned. Not about the manipulation, but about the measures being taken against it. Why do I say this? Well, basically the problem lies within links and the value that Google gives them.
Building links is not difficult
There are so many links out there nowadays that it is not difficult to build them. Social media has completely taken away the need to buy links as well. You buy content instead. Or websites, as a link building consultant said the other week. If you’re in the market for content, you know you pay extremely little for about 100 articles, but you spend a fortune on 100 links. 100 domains aren’t expensive at all, and thus you can build your backlinks without having to worry about their validity, their monthly status and so on. It is much simpler and a lot better.
That’s quality links. Then you combine that with some trackback spam to get the nofollow and domain name backlinks to your content, and you start getting the volume as well. Skip the catalogs and the directories, as they take too much time and generally don’t have enough content surrounding the links to make them worth anything. Perhaps keep a few good news aggregators where you can guarantee that the co-citation is good. Put your RSS in a nice mix in them. This can be achieved by using Twitter as well. No worry that the links are nofollow. What is important is that you get your links in with those other quality content links within your vertical.
Affiliate content is not necessarily bad content
In the more mature verticals you surely see that the quality of the content has a high bearing on the popularity, the ranking and the maturity of the affiliate. I won’t out any examples here, but let’s just take it for what it is, and then we can sort out the examples via Skype if you don’t agree with me.
The problem is that a lot of good content still doesn’t get to the top, as Google continues to put so much value in the links. The big affiliates can afford to build content and quality to match their position. The people who started out building quality content and not links are stuck in the race for the top positions.
We still see obvious spammers get to the top of the SERPs and we see a lot of quality content being penalized. All because Google values links too highly and doesn’t take content into consideration. However, we see a lot fewer spammers than before. Often this comes at the cost of penalizing sinners who care about their content as well.
I’m fine with that, but some spammers seem to get a green card in the SERPs, whilst others, who only try to compete on equal terms, get the Cutt-Fist up their arse faster than a rabbit… The problem lies within the links. If Google cared less about the links and more about the content, I think we would all be better off.
One way to care less about the links would be to pay more attention to the searcher. Not in the sense that personalized search does, which I will cover below, but with respect to what the searcher is doing just as he or she searches for the specific query. I know there are some integrity aspects to this. But Google already stores the data, so why not use it? This is the only reason I would want to talk about the REAL TIME web. If I search for one thing and then for another, I want my second search to depend on my first search if they are in any way related.
Let’s say I search for “potatoes” and then I make my second search “chips”. Then I want Google to assume that I am searching for “potato chips” and not “computer chips” or some dude called Chip. Users are used to searching several times before making their decision. So instead of focusing on links, focus on the search and the sequence of searches. You can also give it the very nice engineering name “Sequential Search” and launch it as something very nice at one of your conferences. 🙂
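The potatoes-then-chips idea can be sketched as a toy disambiguator in Python. The senses and their context words here are entirely made up for illustration; the point is only the mechanism of biasing the second query by overlap with the first.

```python
# Toy "sequential search": pick the sense of an ambiguous query using
# the previous query. The sense inventory below is invented for this sketch.
SENSES = {
    "chips": {
        "potato chips": {"potatoes", "snack", "food", "crisps"},
        "computer chips": {"computer", "cpu", "silicon", "hardware"},
    }
}

def disambiguate(previous_query, query):
    """Return the sense of `query` whose context words overlap most
    with the words of the previous query; pass unambiguous queries through."""
    senses = SENSES.get(query)
    if not senses:
        return query
    prev_words = set(previous_query.lower().split())
    return max(senses, key=lambda s: len(senses[s] & prev_words))

print(disambiguate("potatoes", "chips"))  # -> potato chips
```

A real system would of course use query logs and statistical co-occurrence rather than a hand-written table, but the sequence dependence is the same.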
Cause what you’re doing now for quality is actually worsening your index.
1. Load time variable gives spammers the upper hand
When Google starts looking at load time as a variable, it isn’t doing its job any better. C’mon Matt. Good, unique content sometimes takes a while to load. When you say you give load time extra value, you are rewarding lean text content and clean code rather than quality.
Think about it, Googlers out there. Who does this actually give the upper hand? Who’s got the leanest and cleanest code out there and the capability to really meet this requirement? Well, spammers. Naturally, the people willing to manipulate and work hard on their code, content and page structure, who do not really care about the user experience and who only want to rank high and then convert traffic. These are the people who will have the leanest pages out there, and thus load the fastest.
2. Brands as a means to make SERPs cleaner
I never understood the brand infatuation. I think it sucks that the big institutional brands get an easier time in the search engines. Newcomers in mature markets used to have the upper hand, as they were fast movers and could adjust. What the brand prerequisite does is cripple these fast movers.
3. Personalized search
F**K OFF… I don’t know who is interested in getting search results based on all their history from two years back. Two years back is not relevant now. Not even my last visit to Google is important for the search I am doing right now. Only what I am searching for right now is important. When I am searching for it, I want the best possible match from the WHOLE world presented to me, not what I have previously visited and liked.
Perhaps my range is limited, and I know this. Then I want suggestions of what others like, and I don’t want to go to Twitter to find them. I want them present in the Google SERP off the bat.
4. Rich Snippets
One good change in this direction was rich snippets. But you really have to make this better, as the website developer has to include them in the setup of the page. You can find a comment when there is a comment. You can find a rating when there is a rating. You don’t need us to point this out in our code. Help us out in this area and present it in the search results to the user. That is an extremely good way to improve the search results.
5. Real Time Search
Sorry, but I don’t get this. If I am searching on Google, I want static search results. If I search on Twitter, I want Twitter-now results. Don’t mix them together. It sucks!! I don’t want those “real time” search results cluttering my view when I am looking for resources on the web.
Users don’t use Twitter because it gives better results than Google. They use Twitter because they get replies from friends they already have some relationship to. If you present anonymous tweets, in the sense that the searcher doesn’t know the user who produced the tweet, then YAAAAWN… irrelevant.
Social search is much better that way. But I really need it to be presented in a new way. The old SERP doesn’t have room for these features. Universal search is beautiful, but you had to work on that one for a while to make it fit. Don’t force social search. It is a good feature, but it won’t work if you push it in there. It will only work if you make it a process that can be adopted slowly by the users.
6. Opt Out
I know you people at Google like to force shit on the users. But PLEASE give us the opportunity to remove your filters. Please give us the opportunity to OPT OUT. Put the opt-out possibilities in an open and extremely easy Ajax form at the top right of the page. Make it easy. That’s all I’m asking. Put a little red text at the bottom of filtered search results if you have to: “This site has been penalized for link spam”. Since the new resubmission process came along, you are very transparent towards the site owner about what is wrong with the website. Please make this visible to all users so that we know who does their job and who doesn’t.
I guess that’s all for now.
December 23, 2009
Found this video on YouTube, and I thought I should share it, as it is an important piece of the puzzle when taking control of your own brand online. As people search through many different means, this is perhaps the best way to make an instant impact on the search rankings.
Now you might ask why that is so? Well, with universal search in place, Google tries to return the best possible page for your query. In some cases, such as when you search for “cafe Stockholm”, you’ll receive something looking a bit like this:
It appears at the top of the search engine and takes up A LOT of space. Now imagine if you were a legal firm, a store, a business consultancy? Wouldn’t you want to be the one taking the top place for words such as “legal firm stockholm” or “consultancy stockholm”? I bet you would, and that is why it is important that you fill in your Google Local Business Centre data.
People searching for locations usually want to see a map, so I believe Google will try to place these pages on top of more location searches during the next year.
November 11, 2009
You know, a couple of weeks ago we were discussing duplicate content here on jesperastrom.com. There were some people saying that duplicate content is not penalized but “filtered out”. I said that this was not correct, and I went to work trying to find evidence for my statement.
I have seen this happen sooooo many times. But you know what happens when you say you can find plenty of examples… you find none. I got insecure… what if I was wrong? But NO… I wasn’t. Perhaps this is not a news publishing website, but it proves that the original source gets penalized when a stronger domain scrapes the content of a weaker domain.
Filtering vs. Penalization
By penalization I mean that Google pushes my page downwards in the SERPs because the content exists elsewhere. This means that if my content was supposed to be on page 1, it is all of a sudden on page 3 or 5 or 10, or not in the index at all. By filtering they mean that you get grouped under the little thing called “similar” in the SERP snippet.
Introducing Exhibit A
A couple of days ago I posted my Link building in Social media post. As it happened, it received a lot of RTs on Twitter as well as a few decent natural inbound links. I actually skidded onto the first page of the SERP for the term – link building social media – which gave me quite some traffic.
However, the traffic only lasted a few days. Then came monster MIXX and screwed things up for me. Some bastard submitted my URL to the MIXX website. As we all know, that domain is pretty nasty in strength. It produced a clear case of duplicate content, as it took the title, excerpt etc. I thought this would be safe though, as what is produced on MIXX is merely a link to my website and a description.
This was NOT the case. Instead of my website being listed first in the SERPs, the MIXX page is listed on the second page of the SERP. There is NO sign of “filtering out” effects, no “similar” attribute on the snippet for the MIXX search result.
If this had been a clear-cut case of “filtering out”, my search result wouldn’t appear at all further back in the SERP. However, only one page later I find my post, but Google has changed the title of the post that is displayed in the SERP. (YES, they can do that, and they DO do that.)
My website, which has the original content and MUST be the most relevant on the topic, has been pushed down as a result of a stronger domain publishing the same content on its platform. So… do we call this penalization or filtering out… hmmmmmmm… You tell me!
September 10, 2009
I will now continue my SEO 101 guide by explaining where to put the keywords for a particular page.
Just as you add a table of contents to an ordinary Word document that is longer than a page, so that everyone reading it gets an easy overview of what they are reading, a website has an index of its own. It is called the sitemap. (Now some of you think… “Isn’t that the start page??” NO IDIOT… I will deal with the start page later… but keep in mind that you do NOT… I repeat… do NOT put links to all your content on your start page.)
In the sitemap you should put links to all the pages of your website. As you want your readers, Google and everyone else interested to know where they end up, you use the “jam jar method” to label your links and the pages people land on.
Jam Jar Method
Just as you don’t put “Sweet and sticky” on a jar of strawberry jam (because then no one knows what’s in it), you don’t put a slogan or something catchy in a link to a page. You simply say what’s on the page or in the jar. In the case of the jam jar you would write “Strawberry Jam”; for a web page explaining the recipe for strawberry jam, you would give it the name “Recipe – Strawberry Jam” or “Recipe for strawberry jam”.
Many people put catchy phrases in their links, or even worse, put “read more” in the links leading to their pages. This method sucks, as it doesn’t give the reader any notion of where he or she is going to end up.
On-Page keyword optimization
Once you are on the page, there are some elements that search engines, as well as people, find more important than others.
First of all there is the title of the page. This is what is shown as the link in Google. It is also what is shown in the field at the very top of your browser. The title is defined in the head section of the page. The tag you use to define it is <title>Your title</title>.
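To see exactly what a search engine picks up as the title, here is a small sketch in Python using only the standard library’s HTML parser (the example page and title are made up):

```python
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collects the text inside the first <title> tag of a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.title:
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

page = "<html><head><title>Recipe - Strawberry Jam</title></head><body></body></html>"
parser = TitleParser()
parser.feed(page)
print(parser.title)  # -> Recipe - Strawberry Jam
```

That extracted string is what shows up as the blue link in the SERP, which is why it should describe the page the way a jam jar label describes the jam.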
February 2, 2009
A keyword sitemap should be created for any web project you engage in that you aim to launch through the search engines. Not to be confused with the Google XML sitemap, which should list all the web pages contained in your website, the keyword sitemap should contain:
- all your used keywords
- on what page your keywords are located
- how different pages are interlinked
- what external links you look to attain for each page
- what internal links should point towards this page
- what co-citation cluster you want to put your page in
You should start out by listing your most important keywords. Then find their semantic neighbours by using Google Sets or the Google Keyword Tool. Each main page with a designated keyword needs to contain three sections:
Some information that changes regularly, because you want users and the search engine spider to like this page, return to it often and think of it as relevant and important
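The keyword sitemap described above can be sketched as plain data. A minimal sketch in Python, where the page paths, keywords and link targets are all invented for illustration:

```python
# Hypothetical keyword sitemap: one entry per page, recording the keyword,
# the internal linking, the external links to attain, and the co-citation cluster.
keyword_sitemap = {
    "/car-insurance/": {
        "keyword": "car insurance",
        "internal_links_in": ["/", "/insurance/"],
        "internal_links_out": ["/car-insurance/compare/"],
        "external_links_wanted": ["industry blogs", "insurance directories"],
        "cocitation_cluster": "insurance",
    },
    "/car-insurance/compare/": {
        "keyword": "compare car insurance",
        "internal_links_in": ["/car-insurance/"],
        "internal_links_out": [],
        "external_links_wanted": ["price comparison round-ups"],
        "cocitation_cluster": "insurance",
    },
}

# One check this structure makes trivial: no two pages optimized for the same keyword.
keywords = [page["keyword"] for page in keyword_sitemap.values()]
assert len(keywords) == len(set(keywords))
```

Even a spreadsheet works for this; the value is in forcing every page to declare its keyword and its links before you build.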
January 31, 2009
Extending keywords is a practice where you extend the concept of viewing your keyword as a theme. As the keyword is meant to define your page, you should really try to include whatever words can complete your theme. If your main keyword or theme for the page is car insurance, for example, you should think of whatever other words are related to car insurance.
Such words can be found in either the Google Keyword Tool or by using Google Sets. You can also search for your page keyword and look at which pages end up on top in the search engines. What other words, related to the word you are working with, are included in the top three results? I generally use SEO Quake to find which words have the highest density on top-ranking web pages for the keyword I am pushing for.
January 31, 2009
What is a keyword?
It is debatable whether a keyword is a single word or a theme. What we know is that a keyword defines the content of a page. There is a misconception that you can have several keywords on one page. Each page should be optimized around its own keyword. No other page on a website should be optimized for the same word.
- …defines the page
- …is unique
- …matches a potential search query from a user
January 28, 2009
As the web seems to be completely lacking in Search Engine Optimization (SEO) guides (not), I figure I have to make a run at it. Jokes aside, I haven’t found a really good guide that takes the user from a very basic level to a reasonably good level.
If you are a novice user who wants to learn more about Search Engine Optimization, you have come to the right place.