    Digital Marketing Blog

May 8, 2010

Deciding on Your SEO Strategy When Internal Beef Is What Holds You Back

Let’s say you have an online presence and some traffic to your website. Actually, let’s say you have a huge website, but the search traffic you receive is less than 20% of your overall incoming traffic. Let’s also say you have a couple of different internal departments that all quarrel about who should own the web presence. Let’s also assume there is a big dude from the HR department who always wins all the internal arguments, and thus your index page and sub sections are cluttered with Employer Branding banners.

What the heck can you do?

To solve this kind of problem in SEO you need to understand more than keyword analysis and density. You need to really grasp which pages on your website are the most important and utilize them as negotiation tools in the internal debate. But before you even start to think about the negotiation you need to figure out some other things. So let’s start from the beginning.

Step 1: Who is the internal beef instigator – Key internal stakeholders

First of all you need to get to know who has an interest in the website you are going to suggest a strategy for. If you do not know who to get in line, you will have a huge problem implementing your SEO strategy. You need to know what their business goals are and you need to know what internal arguments and history they are relying on.

Step 1a: Measure everything

One thing I forgot: before you start anything, you should really start measuring everything of SEO value.

  • Inbound links
  • Conversion rates
  • Keyword traffic density
  • Long tail keyword structure and variance
  • etc.
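The list above is easier to act on if you persist it somewhere. Here is a minimal Python sketch of what one dated measurement could look like; the metric names and all numbers are invented for illustration:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SeoSnapshot:
    """One dated measurement of the SEO values worth tracking."""
    day: date
    inbound_links: int
    conversion_rate: float  # conversions / visits
    keyword_visits: dict = field(default_factory=dict)  # keyword -> visits

    def long_tail_share(self) -> float:
        """Share of search visits coming from keywords with <= 3 visits."""
        total = sum(self.keyword_visits.values())
        tail = sum(v for v in self.keyword_visits.values() if v <= 3)
        return tail / total if total else 0.0

snap = SeoSnapshot(date(2010, 5, 8), inbound_links=120, conversion_rate=0.021,
                   keyword_visits={"seo strategy": 40, "internal kpi": 2, "beef": 1})
print(round(snap.long_tail_share(), 2))  # 0.07 (3 of 43 visits from the tail)
```

Take a snapshot like this at regular intervals and you have the baseline you need before the negotiations start.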

Step 2: Collect internal KPIs

The second step you need to go through is to collect internal KPIs. These are highly different from the external KPIs that have an effect on the business success of the online presence. Internal KPIs are most often based upon pride or some sort of internal pecking order. To get to know this you really need to understand people more than you necessarily understand SEO. Once you’ve found these internal KPIs you can move on to the next step.

Step 3: Determine external KPIs

These KPIs are the ones that determine the overall business success of an online presence. With regards to SEO these are usually based on:

  • Inbound links to a webpage
  • Decrease of duplicate meta/content
  • Increase CTR to Success event on a specific landing page

Pretty straightforward KPIs that have to be put into a methodology later in the process in order to make any sense. It doesn’t matter how many KPIs you determine if you don’t set a method to act upon them. You need thorough analysis with regard to business goals, and you also need to make sure that you have a workforce suitable for this kind of task.

Step 4: Set intersect values or alternative values

The next step is more complicated than the two above. In order to set intersect values you need to put initial and comparable values on the two previous steps. This means you have to match the internal value of, e.g., having a banner on the front page with the external value of increasing revenue on a landing page. What is it worth to a person to increase his or her position in the internal pecking order, versus how much do they value business success?

It might sound wicked, but most people do not work towards fulfilling external KPIs and thus they are not willing to walk the extra mile for that purpose. People are much more likely to work towards the internal KPIs and thus you need to align the external and the internal KPIs in order to really get to know how to build a governance implementation that truly works for your SEO strategy.
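As a rough sketch of what an intersect value could look like in practice, here is a toy Python model; the scores, names and the compensation rule are my own invented illustration, not a standard method:

```python
# Toy "intersect value" model: every proposed change gets an internal score
# (what it is worth in the office pecking order, negative = a loss of face)
# and an external score (what it is worth to the business).
def worth_negotiating(change: dict) -> bool:
    """Push for the change only if the business gain exceeds the incentive
    you would have to offer to offset the stakeholder's internal loss."""
    incentive_needed = max(0, -change["internal"])
    return change["external"] > incentive_needed

move_banner = {"internal": -8, "external": 10}  # HR loses face, business gains
keep_banner = {"internal": +8, "external": -4}  # HR gains, business loses

print(worth_negotiating(move_banner))  # True: a gain of 10 covers an incentive of 8
print(worth_negotiating(keep_banner))  # False: no business gain to trade with
```

The point is not the exact numbers but that both sides of the ledger end up in the same unit, so changes become comparable.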

Step 5: Incentive models that increase value of external KPIs

Once you have set intersect values you can move on to creating incentive models. SEO in big organizations is very little about doing it right; it is more about avoiding doing it wrong. If you have many people updating the pages, you need to support the system with incentive models that actually give people an incentive to do a good job.

Step 6: Negotiation

Once you’ve defined and sorted the above, you can get into negotiation with your internal stakeholders. You know what they value and you know what brings them external value or business success. This gives you a load of arguments with which to build a scenario that is favorable to the SEO strategies available to you.

Let’s say you have the big HR person who owns you in the internal dialogue and rankings. If their ambition is to maintain their position on the front page of the website, as that holds a high internal value, then it is up to you to increase the incentive to move to other pages. One such way could be to measure the number of inbound links to other pages, and use the high-value ones to sculpt sections of the website so they rank higher and thus draw better rankings, traffic or conversion.

If you get into the negotiation with real numbers you will be able to maneuver the internal bully quite beautifully. Regardless of whether you choose confrontation with upper management or the more diplomatic one-on-one approach, you’ll do a lot better now that you can bring a clear incentive to the table.

Step 7: Creation of strategy and tactics

Now that you know the positions of the internal stakeholders, you know what SEO strategy you will be able to implement. You have a true and comparable ROI that lets you know, before you get into an argument, whether it is worth it to move a specific stakeholder towards a more favorable position by giving them increased incentives tied to external KPIs.

Argumentation will be so much easier if you do the ground work and can prioritize your own efforts when creating the strategy for SEO.

Step X: Optimization

SEO strategy has everything to do with internal content production, governance and resource management. Due to the industrial structure of many large companies, you really need to focus on the interpersonal aspects when determining where to go next. A standard organization is split into Marketing, Communication, Sales and IT in one form or another. These kinds of structures aren’t well optimized for modern online work.

If you aim to reach goals in SEO you need to find ways around the internal hierarchies and find solutions that do not require major organizational rearrangements. The above sets the maneuvering platform for that to take place. Yes, this is really difficult to sell, and you will lose a lot of pitches to less qualified people, but if you manage to sell this you will do a better job for the client, and that is what it is all about in the end.

If I just confused you, or if you don’t agree, then please write a comment below.

May 2, 2010

How to use Site Wide Links in your Link profile

I have written about site wide links previously. Mainly because many people regard them as junk, whilst I regard them as highly useful if used in the correct way. A site wide link is such a link that appears on all the pages of one website.

For example, a standard blogroll is a site wide link structure. A top navigation is also a site wide link type as it appears on all your pages, linking to the same pages. I use three general rules for site wide links pointing towards my website that I can “control”.

1. Point to root domain

All site wide links shall point to my root domain or to a sub domain. This means that you shouldn’t point site wide back links to individual pages other than your index page, i.e. to www.example.com and not to www.example.com/category/somecategory.php.
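Rule 1 is mechanical enough that you can check it in code. A minimal Python sketch, using the standard library URL parser (the example URLs are the ones from the rule):

```python
from urllib.parse import urlparse

def ok_for_sitewide(url: str) -> bool:
    """Rule 1: a site-wide link should point at a (sub)domain root,
    never at a deep page."""
    parsed = urlparse(url)
    return parsed.path in ("", "/") and not parsed.query

print(ok_for_sitewide("http://www.example.com/"))                           # True
print(ok_for_sitewide("http://www.example.com/category/somecategory.php"))  # False
```

Run your backlink export through a check like this and the rule-breakers fall out immediately.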

2. Narrow or related topic

The website that links to me should either have a narrow topic that is closely related to my website, OR be completely off topic with a keyword cluster that does not use the same words as mine. The first of these two almost goes without saying. If all pages linking to me are relevant for my main topic, then it is relevant to have a link from all of them to my root domain. The second of these two, however, is probably a bit more controversial to claim.

3. Meta and content on linking pages

The title and description of the linking pages, as well as the majority of the content on the pages, shall be unique. This means you cannot auto-generate this content, and thus you should skip these links if you cannot assure yourself that you produce “unique” content. This advice can at first seem like the most basic thing to say, but if you are aiming at the branding angle, then you shouldn’t forget the spam risk you take if you don’t make sure the content gets thorough work. It is also less expensive to create unique content than, for example, to buy links.

Anchor text & co-citation considerations

Now, it is very important that the keywords used in the linking anchor text can be found in your domain, and in that order as well. Cause think about it: the search engines have had to adapt to the social web, with blogrolls as a good way of determining who is a brand and who is not. The blogroll usually links with either the name of the author or the name of the blog as the anchor text. Those linking words are usually found in the domain. And the links usually point towards the root domain. This makes these types of site wide links look and feel natural.
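The "anchor words found in the domain, in order" idea can be approximated with a crude heuristic; this Python sketch and the example domain are my own illustration, not a real ranking signal implementation:

```python
def looks_natural(anchor: str, domain: str) -> bool:
    """Crude heuristic: do the anchor words appear in the domain, in order?
    Flatten the domain to letters only and look for the concatenated
    anchor words as a substring."""
    flat = "".join(ch for ch in domain.lower() if ch.isalpha())
    joined = "".join(anchor.lower().split())
    return joined in flat

print(looks_natural("Jespers Blog", "jespersblog.example.com"))  # True
print(looks_natural("cheap loans", "jespersblog.example.com"))   # False
```

A blogroll link passing this check looks like a brand mention; one failing it looks like a bought keyword link.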

Now, you should use site wide links with caution. If you get site wide links from a website that uses the same “main keywords” as you do, but not in the same context, then Google and other search engines might get confused about which meaning of the keyword you should be ranking for. This is of less importance though. More important are the other pages the linking page is also linking to, i.e. the neighborhood you get placed in as a result of the co-citation.

You shall make sure that you only end up in the same context as other pages within your sector. Don’t put your link on a page with links that point to untrusted websites. It doesn’t matter too much if the other pages that are linked to are off topic, but it does matter if they are penalized or not trusted.

Some final thoughts

Remember that you need links, but also remember that they shall look natural, even if you build them. Try to imitate patterns you find online. This is one pattern amongst many others. Use it with care and don’t spam unless you can handle the downside and the risk. Always… I mean ALWAYS focus on your content quality first, second on your website architecture, third on the incentives to link to you, fourth about your social features and objects and fifth about ways to get good links for free (for example from partners, trusted catalogs and fans)…. THEN and only then think about less/more creative methods that will take you from good to great.

//J.

April 6, 2010

Search Optimizers are the Internet Doctors

I talked to Christian Rudolf tonight about search. It is not uncommon that we talk about links, blog posts and what is happening in the onlinedom. We started talking about what the true essence of search optimization is really about. We both agree that it is about meeting the visitor with exactly the content they want to find for that query. In many cases we believe we are better than Google at determining this for the simple reason that we are people and they are mainly an algorithm.

I can’t talk for Christian, but that is at least why I have no problem bending the rules in the Google guidelines to achieve results. That’s for my own projects. Most of my clients have enough link steam that they just have to build their websites and fill them with findable content, and they rank in prominent positions as a result. They are fortunate.

The hidden web is still concealed

There is so much good content on the web, but only a fraction of it is today published in a format that is accessible through the search engines. There is a huge resource in the “hidden web” that is yet to be unleashed. Once we start making more data formats available for search and clustering, there will be a completely new set of search results a user will be able to browse through.

Today we use related information as a result of meta tags. Imagine the same relevance, but instead of linking documents you are linking pieces of data. I know it is difficult to imagine, so I’ll try to give you an example. What if you traveled to Hamburg and wanted to eat a hamburger in a restaurant that served good food close to your hotel?

How linked data really works

The search results currently returned for a query like that are a very broad approximation of what your keywords might really mean. But if you are able to link data, then the search engine will be able to take the place Hamburg, which it knows is a city in Germany, then look at data sets of distances to restaurants and match these with online reviews for the returned venues. That way it will be able to return a result to you based not on keywords, but on the meaning of your search.
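To make the Hamburg example concrete, here is a toy "linked data" lookup in Python. All the facts and venue names are invented; the point is that we follow relations between pieces of data instead of matching keyword strings:

```python
# A toy triple store: facts instead of keywords.
triples = [
    ("Hamburg", "is_a", "city"),
    ("Burgerei", "located_in", "Hamburg"),
    ("Burgerei", "serves", "hamburger"),
    ("FishHaus", "located_in", "Hamburg"),
    ("FishHaus", "serves", "fish"),
]

def restaurants_serving(dish: str, city: str) -> list:
    """Follow located_in and serves links instead of matching the
    string 'hamburger' against the string 'Hamburg'."""
    in_city = {s for s, p, o in triples if p == "located_in" and o == city}
    return sorted(s for s, p, o in triples
                  if p == "serves" and o == dish and s in in_city)

print(restaurants_serving("hamburger", "Hamburg"))  # ['Burgerei']
```

A keyword engine sees "hamburger" and "Hamburg" as nearly the same token; the linked-data lookup never confuses the dish with the city.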

Sounds pretty sick, right? For now we have to use our best manipulation, architectural and infrastructure knowledge to try to create, promote and justify content that is relevant enough for you to feel satisfied with your search result. As Google is a bot, it would never be able to return such relevant search results if it wasn’t for the work of search engine optimizers. We make relevant content meet the user’s need, and not only their typed-in keywords.

And that’s in a sense why SEO’s are like Internet Doctors

I suppose we’re a bit like Internet doctors. Sometimes your stomach pain means you are lactose intolerant, and sometimes it means you are just bubbly from a night out partying. Yes, we earn money from our practices, and yes, some of us do bad deeds and promote content that doesn’t meet the quality standards you would expect. But the vast majority of us are not in it to destroy; we’re in it cause we love the internet (use google translate if you don’t know Swedish) and want to help improve it.

As search is such a great part of the Internet experience, we have chosen this mechanism as our arena to help you find your stuff. I too wish that we could rid the Internet body of its cancers, but as with all open, free systems, there are people who are willing to abuse them. I am no saint. On the contrary, I have done a lot of things I am not particularly proud of. Although I try to justify them in the name of science, I still feel bad when I set up an experiment that affects the web in a bad way.

And some unusually distorted final thoughts

To end with the beginning, and my discussion with Christian: there are a load of misconceptions about what search engine optimization is all about. I believe it is all about finding out what the searcher wants and then delivering the best possible content for the query that represents that need. Almost like a doctor.

It is late and I just wanted to get this out of my system so that I could sleep.

March 23, 2010

A Beginner’s Guide to Keyword Research

I read a tweet today about keyword research. I decided to go nuts. Thus I also got the inspiration for this blog post. See, this is the thing with Search engine optimization, it is actually kind of an art form. Regardless of how many of the rules you know, there are always exceptions and if you don’t know these exceptions you will end up biting the dust.

This article is intended for those of you who know about SEO, but are still a bit confused where to begin your research for those words/phrases that will start to bring you qualified traffic. It is also intended for those of you who think SEO is bullshit and think that it is all about making your pages readable for the search engine spider. And remember… this is just the basics.

What is keyword research?

So, what is this thing called “keyword research”? Well, it is not about what you should put in the meta keywords inside your head tags. Nor is it a process where you choose ten keywords and then try to find out how to rank for them. Keyword research is all about finding what kind of words/phrases people search for when they are up to doing some kind of thing that is beneficial to your overall business goals. Thus, keyword research denotes the process where you turn to all accessible data and try to figure out what users search for when they try to buy your stuff.

However, it is not as simple as finding ten keywords and then you’re done. Keyword research also encompasses the practice of finding related terms to the concept you are trying to sell. This means that you should try to create information that meets the user throughout their complete purchasing process.

If you manage to find good quality content that corresponds to both the purchasers and the information seekers out there, then you are a very happy person.

How to find the right keywords

I wrote an article about this a while back and it kind of sucked. However, I will briefly describe how you can go about finding keywords in the bullets below. If you have any questions about this, please ask.

  1. Use your metrics tool to see what people are searching for in your internal search engine
  2. Secondly, ask yourself what words you believe you should rank for
  3. What does the search volume look like for the words you have found?
  4. What does the competition look like? (Search for the keywords and see where you are today)
  5. Are there any synonyms for the keywords you have listed?
  6. Who ranks for those words today?
  7. Get SEO Quake as a Firefox plugin and check the keyword density of some landing pages of your competition
  8. Do the same for 5 top bloggers in your vertical
  9. What kinds of words and phrases do they use?
  10. Now that you have a keyword list of at least 500 words (oh yeah baby) you go to Google Sets to find some more
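The first three steps of the list can be sketched as a tiny Python pipeline; the search-log entries and guesses below are invented placeholders for your own data:

```python
from collections import Counter

# Hypothetical inputs: your internal site-search log (step 1) and your
# own guesses (step 2). Real logs come from your metrics tool.
internal_search_log = ["seo guide", "seo guide", "keyword tool",
                       "link building", "seo guide", "keyword tool"]
own_guesses = ["seo guide", "seo tutorial", "keyword research"]

# Step 3: use the log frequencies as a rough internal volume proxy,
# then merge and rank the candidate list.
volume = Counter(internal_search_log)
candidates = sorted(set(internal_search_log) | set(own_guesses),
                    key=lambda kw: -volume[kw])
print(candidates[:3])  # ['seo guide', 'keyword tool', 'link building']
```

From there the remaining steps are manual: check external volume, competition and synonyms for each candidate.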

Now, these first ten pieces of advice were for those of you who are working from scratch and want to get some kind of sense of what the heck you should put in your titles, links etc. to reach your target audience the right way.

When looking at competitive verticals, or when you look at PPC, you should aim at finding lists of keywords that at least border 20,000–50,000 words and phrases, all depending upon how big a search volume you have in your vertical. Remember that Google announced that 25% of the searches each month are new to them. This means that you should work both with content that you know generates qualified traffic (yeah… you will know from your stats) as well as experiment with new content (yeah… you will know what works from your stats).

Tools to use that will help you out

If I were you, which I am not, but let’s say I was. Let’s say I was you and I had a budget for marketing purposes. Let’s say that budget was at least $500 per month (I know all my readers are as poor as I am and you think: “500 dollars!!! Gosh Jesper… who the heck has 500 dollars to spend in one single month…”). But let’s all pretend that we were rich and had 500 bucks to spend on tools per month.

Which tools would I recommend you choose to help you out in your keyword research? (promise: affiliate-link free)

  1. Webmaster Toolkit – this one is FREE
  2. Wordstream – $49 if you’re an SEO, $299 if you’re a PPC:er – and you’ll understand the price difference once you have a look at it
  3. Either Cemper Link Research or Cemper Keyword Research tool – Price starts from €100 –
  4. SEOMoz Linkscape (Free) and Keyword Difficulty
  5. Google Trends and Google Keyword tool (both Free)

That’s about it if you’re looking to rank in a European or American search engine. If you’re looking to rank in other places such as in Yandex or in Baidu then I suggest you just bring out your wallet.

So what should you do with all these keywords?

Well, now you have a list of keywords that your gut is telling you to use. Always trust your gut; then look at the data and see which darlings to kill. Now it is time for you to construct your pages accordingly.

Think about it this way. The search engine spider will enter your web page through a link from an external web page or from an internal link on one of your other web pages. Just as when you enter a room full of people at a party. Let’s say the anchor text in the external link said “bananas in pyjamas”. The search engine spider now checks whether or not your web page knows anything on the topic. Just as you would try to find out if the person was a party animal if someone pointed to him/her at the party and said “party animal”.

SO!! Try to group your collected keywords into clusters that make sense together. A page about cars should perhaps also have a link to a page about the engine. A page about wheels should probably say something about the rims. A page about “bananas in pyjamas” should probably say something about kids watching wicked and disturbing TV-shows.

You should aim at making your page as relevant and unique as possible for what you are trying to rank for. DO NOT use your main keyword excessively. Rather, use semantically related words to make the content as good and link friendly as possible.
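The cluster-grouping idea above can be roughed out in a few lines of Python. This is a deliberately crude sketch (keywords sharing a word land in the same bucket); the keyword list is invented, and real clustering would use search data rather than string overlap:

```python
from collections import defaultdict

keywords = ["car engine", "car parts", "wheel rims", "alloy wheel",
            "engine oil", "bananas in pyjamas"]

# Crude clustering: keywords sharing a word land in the same bucket.
clusters = defaultdict(set)
for kw in keywords:
    for word in kw.split():
        clusters[word].add(kw)

# Keep only words that tie two or more keywords together.
themes = {w: sorted(kws) for w, kws in clusters.items() if len(kws) > 1}
print(themes["car"])     # ['car engine', 'car parts']
print(themes["engine"])  # ['car engine', 'engine oil']
```

Each resulting theme is a candidate page or section, with its member keywords telling you what that page should also mention.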

When constructing a website you most definitely should keep this in mind, as it affects the architecture of your web project. Also, nothing should be more than three clicks away. Also, your pages should have a call to action above the fold and a nice design. Plus, remember the newsletter sign-up and the member functionality. AH!! Don’t forget to add the reviews and ratings, plus all the social buttons. Naturally, do not miss the opportunity to embed your YouTube clips and your rich images. AND remember your page load time. 🙂 Still think that SEO is a walk in the park?

Concluding remarks

Now I am sure you’ll have stuff to do over the coming days and weeks. I am in no hurry so I conduct all my keyword research in the long tail of my Google Analytics installation.

However, if you are a marketing director and you happen to stumble upon this post, then you should keep in mind that you have to start thinking keywords when planning your web project. DO NOT BUILD until you know what you want to communicate. But since no marketing directors would ever read this far into a post, I am pretty sure they’ll continue to order stuff that blinks and plays bgsound on page load.

March 8, 2010

Setting your SEO Activity budget

You’re one of those companies out there who have understood that the web is nothing static. You understand that there is no such thing as “building a website”; you know that you have to “develop your website” continuously. You get that the search engines bring you about 40% of your traffic and that the SEO traffic seems to convert better than the other traffic.

Good for you, cause this blog post is for you to keep.

Setting an online budget

When you set an online budget you’ll have to break it into at least four parts. One part for infrastructure, one for fuel, one for research and one for development.

  • Infrastructure
  • Fuel
  • Research
  • Development

The infrastructure budget aims at giving you a framework from which you can launch content that users can use in their online activities. The fuel is the content, the media spend and the engagement you put into giving your content the proper context so that people find you and talk to you.

The research part is the funds you need to put aside to stay in touch with reality. You should never end up developing your website for today, and thus you have to continuously research the web so that you can always develop for the future. The last, but most important, part is development. You should always develop your website; it should never stop. There is no such thing as a website project with a start and a finish. There is only continuous development. That’s it.

Setting an SEO budget

Your SEO budget falls within all these four parts of your online budget. You need to make the investment in infrastructure so that your content becomes shareable, crawlable and findable. You’ll have to buy marketing so that your stuff is searched for, and you need to acquire links (buy’em or earn’em) so that you rank for those search terms. You’ll have to keep on doing your research as you cannot test it all on your own.

There are millions of blogs and websites out there that share their experiences and empirical tests. Go find them, subscribe to them and then steal from them. Then you have the development part which both includes buying websites, setting them up in your own network, as well as developing new features on your website that makes it worth linking to you again and again and again.

Disposition of budget

The initial investment in infrastructure is all dependent upon where you stand and what type of technology you would like to build in. Always think communication first and then think technology. Never think of it as “hey, I want to build this in WordPress or .Net”. That’s not a smart thing to do, and it will always make your project more expensive than if you first decide what to communicate and then look at what to use to convey this goal.

1. Infrastructure – 30%

Anyhow, here are some things you need to put a price tag on in this portion of the project:

  • Communication platform
  • KPI analysis, implementation of analytics tool
  • Wireframes and Information architecture
  • Function specification
  • Second opinion on SEO work (you always need a second opinion – always!!)
  • Design

I don’t list them all, cause this is what you are used to doing when you develop for the web. I would say this part will take about 30% of your budget. If it takes a lot more than that, then you are building too expensively. Most of you just want to publish content that people can interact with. You don’t need expensive stuff to do that. Focus on the communication and the rest will follow.

2. Fuel – 40%

This part of the project is all about your creativity. You can make even the most dull website seem attractive either through changes in the design or by publishing neat and sweet blog posts. This part is the largest chunk of your budget and should focus on:

  • Video, blog posts, images
  • Guides, white papers and tools
  • PPC budget, Guest posts
  • Other marketing activities

You might wonder what the heck this has to do with SEO, as nothing here seems to be about black ops methods, tweaking code and other such activities that you usually hear about when people speak of SEO. Well, I would say 90% of SEO is about producing neat content that people like. Even black hat projects have to be smart and fill a purpose if they are to work in the long term. You have to do your work right to begin with. Then it is all about making it good.

3. Research – 10%

I usually do my research on my way to and from work. I use my iPhone to find new sources of information. I spend about two hours a day doing research and I think anyone who wants to stay on top should do this. Naturally you think… what the heck… why does he say two hours and then write 10%? Well, even though I don’t work 20 hours per day, I think you’ll need about 10% of your people’s time for research.

  • Reading from your RSS subscriptions – new techniques and link bait tactics
  • Brand watching your own brand
  • Watching your competitors and how they move in the serps
  • Reading patents from Google, Facebook, Twitter, Bing, Yahoo

I am sure there are other ways to do your online research. Especially if you are willing to go deep into investigating what’s in the hidden web. But that’s an article on its own.

4. Development – 20%

You should never stop developing your website (huh… have I said that before in this article?). You can do A/B split testing through Google or you can do multivariate testing through your analytics tool, such as Test & Target from Omniture. You can develop tools for your users that they can use on their platforms, and you can tweak your navigation, sub pages or internal linking so that any user can reach all your content within three clicks.

You can develop some new display of data or let your graphical display evolve. The most important thing isn’t what you do but that you do it. You cannot stop developing your website. If you constantly develop, then you’ll be ready when you have to adapt quickly. There will be problems down the road that you cannot foresee, and opportunities that you certainly don’t want to miss out on. If you work with continuous development you’ll be much better situated than those who don’t.

Plus, you’ll get links, better rankings and a website that seems to be alive. Such websites are more intriguing and exciting to return to even for a bot crawling your stuff. If they see that you change often, then they’ll have to return often.

This portion also includes test projects that you put up on other domains before you launch them on your own website. You should buy one or two domains every month just for the laugh of it. Put up projects on these websites that you are interested in testing for your main website and see how they affect your rankings, your traffic and your general brand volumes and activity.
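The 30/40/10/20 split described above is easy to turn into a starting point for your own numbers. A minimal Python sketch; the $10,000 total is an arbitrary example:

```python
def split_budget(total: float) -> dict:
    """The rough 30/40/10/20 split from this post; amend per project."""
    shares = {"infrastructure": 0.30, "fuel": 0.40,
              "research": 0.10, "development": 0.20}
    return {part: total * share for part, share in shares.items()}

print(split_budget(10000))
# {'infrastructure': 3000.0, 'fuel': 4000.0, 'research': 1000.0, 'development': 2000.0}
```

Treat the shares as defaults to argue about, not as fixed truth.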

Concluding Remarks

The above has to be amended for each project, but it gives you a rough estimate of how you should work with your online budget, and in particular your SEO budget. If you get one thing from this post, it’s that you need to do more than just build a thing and put it out there. You need to do stuff with it in order to make it fly. Yadayada… 🙂

Love ya.

//Jesper

March 7, 2010

Thoughts on SEO Future and how to rank on a realtime web

I was asked on Twitter how this article from ReadWriteWeb would affect search engine optimization in the future. I haven’t written anything on SEO for a while, so I thought I might take the opportunity to write a bit more extensively on what I believe to be the most important aspects to take into account for the coming year.

1. ATOM PuSH not really the issue here

The article talks about ATOM PuSH functionality that would help you as a publisher to get your content indexed faster in Google. This will not in itself change the way you rank in the search engine. You still have to have crawlable content, it still needs recommendations through links and it still needs to be well written.

What it really changes is the fetching method. Up until now, Google has been fetching content with regard to relevancy. The article, however, says they are to start fetching content based on how new it is. I don’t know if this will affect the search result pages in any significant way, but one could possibly assume that Google will rank popular content, i.e. content that is subscribed to by a lot of other people, higher than content which is not.

2. Change over Time

In the beginning of SEO it was all about volume. Nowadays volume is only one property by which Google evaluates your content. The property I believe will grow the most in importance over the next couple of years is change over time. Change over time is the only way you can look at a piece of content and determine its relevancy on a real-time web. It is the only way of finding spam and it is the only way of determining which content is the most relevant “right now”.

Actually, what we have to look at is the second derivative: is the sum of something accelerating or is it decelerating?

Think of two pieces of content: one with a lot of links, and one with fewer links. Think of the one with fewer links getting one link the first day, two links the second day, three links the third day, whilst the one with more links is getting one link each day. Let’s say the content with fewer links was newer than the one with more links. Which would you think was more relevant? The one accelerating by one link per day, or the one receiving one link per day? Tough question, right?

Let's say we add Twitter to this. The publisher of the content has 1,500 followers and the content got retweeted 50 times on the day of release. The older content didn't get retweeted at all, as it was produced before Twitter existed. The relevancy factor, or news value, of the story then becomes 50/1500 (my own KPI :)) relative to all other content out there. I am getting sidetracked, but what I am getting at is that I believe Google will look more and more at change over time, as it shows how relevant a piece of content is at this point in time.
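The acceleration idea above can be sketched in a few lines of Python. The numbers are hypothetical; the point is only that summing links ignores the trend that the first and second derivatives capture.

```python
# Toy comparison of two link histories (hypothetical numbers).
# Each list holds the number of NEW links gained on each day.
steady = [1, 1, 1, 1, 1]   # older content: one new link every day
rising = [1, 2, 3, 4, 5]   # newer content: one more link each day

def velocity(daily):
    """New links gained on the most recent day (first derivative)."""
    return daily[-1]

def acceleration(daily):
    """Change in daily new links (second derivative)."""
    return daily[-1] - daily[-2]

for name, daily in (("steady", steady), ("rising", rising)):
    print(name, sum(daily), velocity(daily), acceleration(daily))
# → steady 5 1 0
#   rising 15 5 1
```

By total volume the rising piece only wins 15 to 5 here, but its acceleration (1 vs. 0) is what signals that it is the one gathering momentum right now.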

Thus Google will rank content higher in the SERPs if it comes from a source whose content usually gets shared. They will then watch it over a period of time to see if this content is as good as the old stuff. If it doesn't receive the same link growth over time as other content in the vertical or SERP, it will decrease in value and fall in the rankings. This somewhat contradicts the "authority" paradigm that is talked about in SEO circles.

In the future, authority will only give you an edge, as it will put you on the first page to get "tested". Yet if your content does not pick up speed, it will decrease in value over time, regardless of whether your site is much stronger in volume than other websites.

3. Localization

There is a localization revolution going on in the SERPs right now. Search for "lawyers in washington" and you'll see what I mean. If you have a physical presence and an address listed in Google, you will have much better visibility than those who don't.

Add to that the geo-tagging possibility in the meta tags of your web pages and you'll have an edge over anyone with a less local touch to their business.

The address helps people find things in a certain place; the geo tag helps them retrieve content that is more relevant given their geographical location.

Naturally, localization of the SERPs has everything to do with Google wanting to become as dominant offline as they have become online. I am looking forward to seeing the results of this. 🙂

4. Socialization

This is a change I don't necessarily like or enjoy that much. In the future my search results will become dependent upon my interactions with others. This might work well if it is confined to a certain time frame, but if they handle it as they have handled personalized search, I am not that enthusiastic.

I would like Google to look at sequence within a session rather than look at what I have searched for or talked to my friends about “throughout all of time”.

Socialization should be a factor, as we trust content from people we know more than content from people we know nothing about. Hence the assumption would be that we find content more relevant if it comes from those we have a relation to. However, I don't believe that this is entirely the case. It might just be that our SERPs will get overrun with stuff we don't want to see, just as blog posts have killed information searches to a large extent.

Concluding thoughts

Well, I am unsure if my brain has woken up enough to make any sense, but here you go. These are four of the most important aspects I see becoming more and more relevant when thinking of search engine optimizing your webpages.

January 7, 2010

How to do a .htaccess 301 redirect

This article is a part of the redirect series I am creating. To find the full series read this article and find a link to the full resource at the bottom.

Duplicating your content to the new location or new domain name

When you're about to redirect your content to a new server or a new domain, you should not mix it up with changing your file structure at the same time. This is very important. If you change your domain name and your catalog structure at the same time, you will have a whole mess to untangle, especially if you are moving servers as well.

You should redirect one step at a time and carefully dissect whether or not additional changes have to be made. Perhaps a change isn't needed and would merely look neat. Keep such changes to a minimum and think of functionality, SEO value etc. before making them.

It all depends on the situation, but if you don't want to lose too much in rankings you should copy all the files, the file structure etc. from the old place and upload them on a 1-to-1 basis to the new place. This means that you should put:

  • http://someolddomain.com/someoldcatalog/someresource.html

to

  • http://somenewdomain.com/someoldcatalog/someresource.html

Don't change the catalogs at this stage. If you desperately have to, then sure; one instance where it actually makes sense is when your catalogs don't make any sense in the URL. Otherwise, drag all the files from your old server to your desktop, then upload them to the new server.

For those of you running databases or dynamic pages: export your database, import it into the new one (if it is not the same one) and change the database connection configuration on the new installation of your web platform so that it fetches the same content in the same way as the old platform did.

Creating a .htaccess file

The .htaccess file is a per-directory configuration file that tells your (Apache) web server how to handle things like security, redirects and URL structures for your files and resources. The server reads it whenever a visiting client requests something from the directory it sits in; the client itself never fetches it directly. If you log onto your web server via FTP or something similar, you might find that there is no .htaccess file there.

If this is the case, go ahead and create one. Do it in Notepad or a similar plain-text editor so that no junk or extension is added to your file. I.e. it should not be .htaccess.txt or .html or .xml or any other extension; it should simply be .htaccess. Don't forget the dot. If you already have a .htaccess file on your server, download that one to your computer and edit it.

1. Redirection of Static pages

  1. Place this code in your .htaccess file:
    redirect 301 /old/old.htm http://www.newdomain.com/somefile.html
  2. If the .htaccess file already contains content, then scroll to the end and make sure you leave one line between the old code and the new code
  3. Save the file (still no extension)
  4. Upload it to the root folder of your server
  5. Go to a standard web browser
  6. Type the old url
  7. If you’ve done this correctly you should end up in the new location

Usually the main problem is that you haven't uploaded the .htaccess file to your root folder, or your .htaccess file has an extension, or you have forgotten the dot, the ., the [.] the ………….. THE DOT!! Don't forget the dot in front of htaccess.
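A quick way to sanity-check step 7 without a browser is to request the old URL and inspect the raw status code. Here is a minimal sketch using Python's http.client, which reports redirects instead of silently following them. It runs against a throwaway local server standing in for your site (the hostname and paths are the hypothetical ones from the example above); in practice you would point the connection at your real domain.

```python
import http.client
import http.server
import threading

# A stand-in server that answers every GET with a 301, the way the
# .htaccess rule above would. Replace with your real host in practice.
class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)
        self.send_header("Location", "http://www.newdomain.com/somefile.html")
        self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# http.client does not follow redirects, so we see the raw 301.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/old/old.htm")
response = conn.getresponse()
print(response.status, response.getheader("Location"))
# → 301 http://www.newdomain.com/somefile.html

server.shutdown()
```

If you see a 200 instead of a 301 for the old URL, the rule isn't being picked up, which usually means one of the dot/extension/root-folder problems above.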

Locating your root folder

You should always put your .htaccess file in your root folder, because that is where the server looks for it when a client requests a resource. The root folder is usually the one you see when there is only a / in your browser's address field. This can vary, however. Some web hosts have a www_root folder in which you find your files, or they have an htroot, public_html/ or some other root folder. My suggestion is that you either ask your hosting provider or test/fail/succeed.


2. Redirection of Dynamic pages or Entire domains

When redirecting pages that are fetched from a database you should do this in a different way. Basically use the following code:

RewriteEngine on
RewriteCond %{QUERY_STRING} ^id=[page-id-nr]$
RewriteRule ^page\.php$ http://www.example.com/newname.htm? [L,R=301]

You will have to replace [page-id-nr] with the ID of the page you are redirecting. I.e. if you have a page with the URL http://www.example.com/page.php?id=256, which is what the default IDs of a WordPress blog look like if you haven't changed the permalink structure, then the code above should look like this:

RewriteEngine on
RewriteCond %{QUERY_STRING} ^id=256$
RewriteRule ^page\.php$ http://www.example.com/newname.htm? [L,R=301]

I know, now you're thinking: OMG, do I have to do this for all my pages?! And my reply is: YES, if you are changing everything at once. BUT if you're keeping the same file structure and only changing domain or server, then you can skip the above and simply put:

redirect 301 / http://www.you.com/

If the above doesn’t work for you, then check if your .htaccess file has the proper name, the proper extension (meaning NO EXTENSION), and that it includes the ., the dot, the [.] (can’t stress that enough).

Then try this alternative:

redirectMatch 301 ^(.*)$ http://www.domain.com$1
redirectMatch permanent ^(.*)$ http://www.domain.com$1

If this doesn't work either, then e-mail me and I will update the post.

What to think of when changing server and/or domain name

If you are like me and obsessive about avoiding duplicate content, then you should do the following while you work on your redirection process. Let's step back and take it from the top at a macro level.

  1. Block the new domain in its robots.txt file (you can generate one from your webmaster tools console)
  2. Upload the old content to the new place
  3. Make sure your old content is visible on the new place by typing in the new url or by requesting the new server
  4. Make the redirection
  5. Remove the robots.txt block from your new domain
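While the block in step 1 is in place, the new domain's robots.txt can be as blunt as this (it is removed again in step 5):

```
User-agent: *
Disallow: /
```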

This is important, but not the most important thing to do. If you want to make sure Google doesn’t find the content on the new domain then you should noindex it until the redirection is in place. There are some “creative” reasons to do this as well, but I’ll tell you more about that in another article.

There is a module called mod_rewrite on Apache servers. In another article I will cover how you can use mod_rewrite to make this redirection process easy and powerful.

This article is a part of a series of articles covering redirections. Once the full tutorial is done you can find it under the Search Engine Optimization tab.

January 7, 2010

Different variations of redirects, 301, 302, 303, 304 etc

For a complete assembly of HTTP status codes, either visit the Wikipedia article on the topic or go to the W3.org Status Code Definitions page.

The purpose of HTTP Status codes

Basically, HTTP status codes are used by computers to understand in what way they should talk to each other. It is like the computers' way of distinguishing between a snobby lady and a street kid with a baseball cap: it might still not work, but you are a lot better off if you request a service from them in two radically different ways.

On a more technical note, HTTP status codes tell the two computers chatting with each other in which way, and from where, they should interpret, fetch and forward the information they are processing from each other.

301 Permanent redirect

The most common way of redirecting users from old places to new ones is the 301 redirect, which basically tells the requesting client that some content has permanently moved from one place to another.

302, 307 Temporary redirect

This was previously debated to be equal to the 301 redirect for SEO purposes. I think that debate is dead nowadays, as it is pretty evident that the 301 works a lot better, even for temporary moves. Basically, a temporary redirect tells the requesting client that the content has temporarily moved elsewhere, and that it should continue to request the old location in the future, as the content will return there in a while.

The difference between the two is that a 307 guarantees the request method won't change when the redirect is followed, but in practice you shouldn't use the 307, as it is not understood by many agents. (Simple, eh? :))

303 Use GET method to retrieve

This basically tells the client to use a separate GET request to fetch the result from another resource. It is not applicable in this exercise, but it is worth keeping in the back of your head if you EVER need it.

305 Use Proxy to fetch

The 305 redirect tells the agent that the resource requested must be fetched via proxy.
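For reference, the standard reason phrases for the codes above can be pulled straight from Python's stdlib http.HTTPStatus enum:

```python
from http import HTTPStatus

# Reason phrases for the redirect codes discussed above.
for code in (301, 302, 303, 305, 307):
    print(code, HTTPStatus(code).phrase)
# → 301 Moved Permanently
#   302 Found
#   303 See Other
#   305 Use Proxy
#   307 Temporary Redirect
```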

This article is one in a series focusing on redirects. To view the full list of redirect blog posts visit the Search Engine Optimization tab.

January 7, 2010

Why Redirect old content when Changing domain or Server?

If you are looking to change your website's domain, reorganize your file structure or simply change the URL of one of your pages, make sure to create a redirect for that page.

Loyalty reasons to redirect content that is no longer there

Redirects are important primarily when you have a mature website that is indexed in Google or any of the other search engines. Given that you have some readers and your content is valuable to them, it is likely that they have stored links to your website in one place or another. It might be through a social bookmark, a bookmark in the browser or through a link on their blog or website.

When you change the urls, the old urls simply stop working if you don’t create a redirect that directs the old links to the new content.

SEO Reasons to Redirect old content

Google, Yahoo and Bing assign relevance based on how many people and websites have linked to your page, so it becomes highly relevant to preserve this value. By creating a proper redirect you make sure that most of the value those links gave your old website is retained.

With regards to Google, it is important to create a redirect from the old server to the new one even if you are only moving to a new server or hosting provider. Remember that the neat domain names aren't what the computers see when they fetch your website from a server; that is only what the browser shows you in the URL field at the top of the window.

Spammers use this method in some creative way and I will cover that in a future article.

Remember to also do this when moving to new hosting or servers

The computers see something like http://201.214.5.63/root/cat/?pid=546 and convert it into http://www.yourdomain.com/acatalog/somepage.html. When you are on a new server, but with an identical domain name and structure, a computer fetching the content at the old place http://201.214.5.63/root/cat/?pid=546 won't be able to find it, as it is no longer there. You should thus create a redirect even if you don't change the name but change the location of your website.

This article is one in a series focusing on redirects. To view the full list of redirect blog posts visit the Search Engine Optimization tab.

December 2, 2009

Every Link comes with a Person(-a)

After a SearchMeet conference with loads of talk about links, I find it quite nice to kick back and write a bit about the social basics behind natural linking on the web. Because links, as they are used, are more than a connection between two pages on the web. A link comes with a person, or more accurately, with a persona.

So, how can I say this? Well, basically the statement is twofold. If you receive a link from someone, they actually do more than connect their page to one of yours. They connect everything that has been said on that domain historically, along with all that will be said in the future, when passing the link love along.

Secondly, it is always a person linking. Even though there are some pretty neat spam tools out there, the common link is an action by a person, with a reason to use the link as a way to put a story into context, or to mention what they are linking to with a good or bad reference.

Co-Citation

Many of us have read Jim Boykin's co-citation post. I remember finding a link to it in a post written by Swedish SEO Jim Westergren sometime in 2007. What it says is basically that your website is affected by the neighborhood it is connected to. Just as we get judged by the crew we hang with when out in town on a dark winter night, Google cares about which pages your web pages hang out with online. The way they judge your crew, and thus what to label you with, is by watching which pages you are linked together with.

LSI/LSA – Latent Semantics

What this theory basically tells us is that in order to be an authority on a topic, you have to know a lot about it. If you are an authority within a niche of a topic, then you have to know all about that niche. Basically, LSA looks at what you know: whether you know the details, the slang and the synonyms, and whether you can connect them to all other information in related topics. When Google uses LSI, I suspect they first look at what you write. Do you cover the full topic? Do you spend time linking to individual pages with more detailed descriptions of complicated terms? Do you vary your copy to include the different synonyms, and do you link to the most authoritative page outside your own website when the info cannot be found on your own?

Semantics deals with meaning. If I write "break a leg", that can mean several things: it can be a wish for good luck, but it can also be a curse for bad luck. Google, and semantic analysis, aims to know when you mean which by determining the context in which you use the phrase. That is what LSA is all about.

A Person Links because

On the modern web, the link to your website is shared amongst users if it leads to a valuable resource. Value can be determined in many ways, but in the end it is all about what the reader thinks is valuable. People will link to your content if they like it, or if they think you have said something important, or explained something well enough to clarify what they themselves are writing. You complete them… to use the words of Mini-Me.

Regardless of why a person links to you or not, a natural link comes with a motive. That motive might be praise, recognition or just simply a way to say that “you explain this better than I so I won’t even bother”. A link might also be a thank you for something you’ve done and it might sometimes be a sign of the fact that you belong in the same crew as another person. Blogroll links are commonly used to say; “we write about the same stuff”.

In the Eyes of Google

As your link profile starts to build, all the people linking to you become the mirror of your face online. Your persona builds from the way others include you in their multidimensional Internet. With each link, your dependence on the next link decreases while your importance to the previous link increases. Your persona, and thus the personas of all the people who have previously linked to you, change as new links put you in a new light with the search engines.

In the eyes of Google, I suspect, your complete profile is a constantly changing process. Just as you grow older and gain wisdom, or idiocy, your link profile is what determines your position in the eyes of others. If you have been a "good" person, then you will probably have a "good" legacy. People will have linked to you because of your impression on them or because of your knowledge.

Google recognizes this. They also recognize that a one-time star can fall by breaking trust. That's why you have to build your links over time, in a constant and increasing flow, as your reputation spreads all over the web.

What am I really trying to say

Well, basically I am trying to say that natural link exchange on the web, just like the relationships between people, is just as judgmental as society. If you hang out with the wrong crew, or if you say the wrong things, then you'll have a much tougher time making advances in life.

In the eyes of the search engine, you have to keep your chin up, and not just leave a one-time gigantic footprint. You have to care for your relationships and inbound links over time. You have to nurture where you get mentioned and how. You have to exercise the kind of judgment you would in real life.

LSI and co-citation tell us that it doesn't matter if we wear a suit if we hang out with gangsters. It is the associations we bring that determine how people will judge us, our websites and what we do online.

//Jesper
