September 24, 2013

Google removed Keyword data from Analytics – 101 ways to get it anyhow!

Keyword data seems to be disappearing from Google Analytics. So. How do we know what to optimize our websites towards? How will we be able to know what supporting keywords to use in our texts? How will we be able to run keyword-targeted conversion optimization on the organic traffic we have earned?

No worries. We’ll get to that. Obviously Google says they want you to work with your website, to increase your quality and better serve users with better content. However, you will now have to resort to paying in order to get the juicy stuff.

Below is a set of tools and tactics you can use to get at the keyword data anyway.

1. Wordstream is the best keyword tool out there – use it

  1. Wordstream is a tool that helps you create AdWords keyword groups for your content. It helps you find the most profitable niches and is available in many different languages.
  2. I suggest you create a subscription account on Wordstream. It will cost you some money, but it is surely worth it.

2. How to bid on AdWords to get keyword data

  1. Use the Google Keyword Tool and generate dynamic tracking URLs (see the sketch after this list)
  2. Do not bid broad; you will lose a lot of data that way, as your exact-match terms will not be displayed in your account
  3. If you are simply interested in search volumes, use the Google Keyword Tool AND bid with minimum bids on all keywords you can possibly think of
  4. Bid narrow and use negatives.
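A minimal sketch of what a dynamic tracking URL can look like, assuming you tag destination URLs yourself: {keyword} and {matchtype} are AdWords ValueTrack parameters filled in at click time, while the landing page and the kw/mt parameter names are placeholders of my own.

    from urllib.parse import urlencode, urlparse, parse_qs

    # Build a destination URL with AdWords ValueTrack parameters.
    # AdWords substitutes {keyword} and {matchtype} at click time;
    # the "kw" and "mt" parameter names are this sketch's own choice.
    def tracking_url(landing_page):
        params = urlencode({"kw": "{keyword}", "mt": "{matchtype}"}, safe="{}")
        return landing_page + "?" + params

    # Later, pull the keyword back out of a landing-page hit in your logs.
    def keyword_from_hit(url):
        qs = parse_qs(urlparse(url).query)
        return qs.get("kw", [None])[0]

    print(tracking_url("http://example.com/offer"))
    # http://example.com/offer?kw={keyword}&mt={matchtype}
    print(keyword_from_hit("http://example.com/offer?kw=blue+widgets&mt=e"))
    # blue widgets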

3. Activate internal search tracking in your Google Analytics

  1. I find it awful that so few companies have activated their internal search tracking. MORE than 20% of the people on a website use onsite search. You will get the exact keywords to optimize for PLUS deliver a brand-building experience to your users if you work on your internal search.
  2. To activate it: open the profile you want to track > click Admin in the top right corner > click View Settings > enable Site Search tracking at the bottom of the General settings page. The parameter you need to enter is the one the search query generates in the URL. If you search for “search” on this blog – http://jesperastrom.com/?s=search – the parameter you would enter is s.
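Google Analytics will aggregate these terms for you, but you can sanity-check the same parameter straight from your access logs. A minimal sketch, assuming you have the requested URLs handy and that your site uses the WordPress-style s parameter:

    from urllib.parse import urlparse, parse_qs
    from collections import Counter

    # Count internal search terms from requested URLs (e.g. access logs).
    # "s" is the WordPress search parameter; swap in your own site's.
    def internal_search_terms(urls, param="s"):
        terms = Counter()
        for url in urls:
            qs = parse_qs(urlparse(url).query)
            for term in qs.get(param, []):
                terms[term.strip().lower()] += 1
        return terms

    hits = [
        "http://jesperastrom.com/?s=search",
        "http://jesperastrom.com/?s=cloaking",
        "http://jesperastrom.com/?s=Cloaking",
    ]
    print(internal_search_terms(hits).most_common())
    # [('cloaking', 2), ('search', 1)]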

4. Download the “Not Provided” kit

  1. This was a kit – actually an analysis framework – that one of my former students, Markus Frick, tipped me off about on Facebook. Download it from here.

Ok. So that’s 4 ways. I know you’ve got one to add. Comment away!!!

//Jesper

October 21, 2012

Mobile Cloaking in the coming years

Do you have a mobile CSS on your website? I think it is time you got one. Especially if you work with lead generation, sales or any type of transaction on your website. Mobile traffic is increasing like a mother fucker and you need to be prepared for it. Are you still asking whether to build a native app or a mobile website? Don’t. Build a website. First. Then see if you need an app.

What is cloaking?

Going back a few years in time, a practice was used to deliver one content type to the visitor and another to the Google bot. This would make Google believe that your web pages were filled with valuable text and image content that was highly relevant for some keyword. While, in fact, your page, when served to a normal user, had anything but copy on it. It was simply a cluster of your most converting content put into a structure where it would convert the most possible visitors.

Webmasters did this so that they would rank for stuff their pages didn’t really deserve to rank for. This practice was called cloaking.

Google solved the problem by further developing their algorithm so that it would detect this practice. They also added manual checks of websites in order to rule out any sneaky bastards who had been able to reverse engineer the new changes to the algo. In a sense, cloaking is a part of the past, at least when it comes to PC/Mac to web interaction.

Mobile cloaking is here

However, as the web has expanded to also deliver access to mobile devices, our cloaking skills are experiencing somewhat of a renaissance. People are, to a greater extent, picking up their mobiles to search for whatever they want to find. From our analytics we have seen mobile usage of the websites we monitor increase gradually. For the past few months, this traffic increase has started to accelerate.

So, what should you do with this traffic when it comes to your website? My suggestion is that you cloak the living shit out of it. 

So what do I mean when I say “cloak the living shit out of it”? Well, basically I am talking about offering your mobile visitors another interface than the one you serve your computer browser users. Fair enough. But I mean seriously change the way it looks on the mobile. Users are lazy. Remove the unnecessary stuff. Give them only the essentials for conversion: either to a lead with an e-mail form, or through a simple one-click phone-number purchase. Google doesn’t seem to penalize you for this. Perhaps because you are actually delivering a better page experience than the one with all the text on it.
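A minimal sketch of that idea, assuming a Flask app and hypothetical template names: same URL, but mobile visitors get a stripped-down interface while desktop visitors get the full copy. A production setup would use a maintained device-detection library or responsive CSS rather than this crude user-agent check.

    from flask import Flask, render_template, request

    app = Flask(__name__)

    # Crude device check, for illustration only.
    MOBILE_HINTS = ("iphone", "android", "ipod", "blackberry", "windows phone")

    def is_mobile():
        ua = request.headers.get("User-Agent", "").lower()
        return any(hint in ua for hint in MOBILE_HINTS)

    @app.route("/offer")
    def offer():
        if is_mobile():
            # Stripped-down page: headline, e-mail form, click-to-call link.
            return render_template("offer_mobile.html")
        # Full page with all the supporting copy.
        return render_template("offer_full.html")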

What is your Mobile cloaking opportunity?

This will not only improve your conversion rates, but it will also improve usability for the visitor on the mobile phone.

And it is here that Google’s dilemma escalates. They will probably never be able to penalize you for these differences in delivery. Users will not like pages with loads of text on them on their mobile phones. Thus, as Google has the ambition to deliver great content, they will have to find a way to rank good mobile pages. The only way they can do that today is to use the web CSS delivery, regardless of whether you serve your visitors something else through the mobile phone.

From all I can derive from the numbers, Google will rank you on the cellphone based on how your webpage behaves for a “non-mobile” visitor. It will give you some advantage for localized content as well as for mobile CSS. HOW this mobile CSS displays the content, on the other hand, seems to be of lesser significance.

People are searching the web through the phone. Ok. The search behavior is a bit different. It pays to keep an eye on what Google Suggest is giving users as options for your specific niche keywords. Use it. People are lazy and more prone to use what is offered to them. Either way, you should be seeing more traffic from the cellphone. Both social as well as search traffic.

So. That’s an opportunity for you.

 

October 11, 2011

Black Hat SEO and how to do it

There is this thing referred to as Black Hat SEO. The definition is always debated amongst search engine experts. Wikipedia defines it as the deliberate manipulation of search engine indexes. If that were the case, then all SEOs and companies wanting to advance in the search engines would be black hat SEO practitioners.

I define Black Hat SEO as activities where you automatically steal, hack or generate filthy, link-drowning content to support your greedy ambitions in the search engine result pages. Practices such as content scraping, cloaking, comment spamming, database mining, pingcrawl, auto submitters and naturally – hacking of exposed servers – fall into the category.

Beginning my Black Hat Tools journey

I have previously bought links. A practice I recognize as gray. I have never hacked a website, or employed automated link building tools to create serious link influxes to my websites. Yes, I have actually automated social bookmarking, but not the nasty kind.

From today, that will all change. I am tired of not knowing, and will start a journey to research which of the tools out there work and which don’t. I have set a budget of $10,000. My goal will be to purchase tools until that sum is spent. Here are my rules:

  • The majority of content on the website shall be auto generated
  • All back links shall be built automatically
  • I am allowed to engage with users in social media, but I am not allowed to generate the content myself
  • The domains have to be new
  • I am not allowed to break any laws

What tools shall I test?

I have a short list of approximately 10 tools including some I am going to order custom versions of. I suggest you send me an e-mail to jesper.joakim.astrom@gmail.com if you want me to try out your service or if you know of a service you think I should buy and test.

Why do I want to try Black Hat tactics?

First of all I want to know if they make any sense. Can you still spam your way to the top of Google? I want to learn every detail of executing them. I need to know. Then I want to share this knowledge. What works shall be put into the toolkit. What doesn’t work shall be called out. If we are to progress, we need to research, investigate and examine.

Plus. I think it will be loads of fun.

Feedback. Anyone?

September 27, 2011

Reading level and SEO

If you want your texts to be readable by many people, you need to write in a way that people understand. Simple enough, right? The easier to read, the more accessible. Simple is better than difficult and short is better than long.

I have regularly fallen victim to my own ego when writing tutorials, extending them beyond the limits of what is readable on a regular screen. Thank God for mobile devices making long texts readable again.

In the eyes of Google, the texts on your website are basic, intermediate or advanced. That is their reading level. Yeah, Google actually tries to understand your texts and then categorize them under basic, intermediate and advanced labels.

This is a natural evolution, as Google wants to rank the most relevant content at the top of the search engine.

If a user cannot understand the content Google serves, that search result is less relevant than one the user actually understands.

Write for the level of your audience

But hold on. Traffic is really just one aspect of SEO. Don’t rush out and rewrite all of your texts just yet. Traffic might be lower as a result of getting only people with an advanced reading level to your website, but perhaps that doesn’t make your website less relevant for the users who actually end up there. I guess the thing here is to know your preferred/converting visitor demographics/sociographics and write for them.

Consider a camera retailer. You are selling cameras and you hire the photography nerd of a lifetime to write your copy. That person will write about the camera knowing all the right jargon, like the camera descriptions you find on Amazon.

The standard camera buyer might not know all the widgets, gadgets, settings and data that need to be in place for the perfect shot. They might be searching for terms such as “cool cameras”. As on many camera websites, this technical description is often the only info you get about the camera. I would say it is advanced, and so, probably, would Google. Thus, although it is long tail, it will probably have a difficult time ranking on its own against a page that has both this information and info that is less difficult to grasp. I.e.:

  • Take two pages with the same technical-description mumbo jumbo
  • The one that adds extra, simplified text alongside the jibber-jabber will get more traffic, even for searches on the tech terms

Amazon realizes that it doesn’t have the resources to rewrite all product descriptions in its database and thus it allows for user reviews where the product usage and likability can be discussed. Naturally, the more common the language, the tougher the competition (hopefully) and the more potential traffic. By using both you are covered in the advanced long tail as well as in the common peak.

Now consider a poker affiliate, perhaps a clearer example.

  • Who is the user that will convert into a regular player?
  • Is it the experienced player, or is it the person who has watched high stakes poker on GSN on a weekly basis?
  • What will they search for when they want to convert into a regular player?
  • What is their terminology and what level of understanding do they have of the content you offer?
  • Do they really understand the meaning of shove, felted, three bet or position?

Perhaps they are on the level of flop, turn and maybe river.

When hiring an experienced poker player you might get the attention of the loyal poker elite, but you might not get the attention of the beginner. They might, on the contrary, be scared off by your advanced level and think that it is not for them. The reading level of your content thus becomes significant not only for the kind of traffic you get to your website but also for the kind of conversion you get out of that traffic.

Does this “REALLY” matter for rankings in the search engine?

So, does reading level have anything to do with SEO and the results in the search engine? Google’s latest update, called Panda, seemingly punished those who use complicated or more advanced texts and rewarded those who used easier language.

The job of an SEO has traditionally been to collect links, and that will probably be the core of the work duties for quite some time from now on. However, since the Panda update I have several tutorials that rank the living shit out of websites with four or ten times as many links as I have. Perhaps not because of the level of the tutorials, but more likely because of their length.

I believe, as I have often argued, that many SEOs need to update the way they target themselves: from traffic to conversion rates. Conversion rates are the true measure of content quality and should thus be used as the main metric for relevance. I think that if we shift into measuring conversion rates ahead of traffic increases, content quality – reading level included – will improve as a result.

How to find the reading level of your website

In order to find out the reading level of your texts you should write site:yourdomain.com in the search field of a Google search engine. This will list all the pages of your website that Google has indexed.

To get the reading-level percentages you also need to click the reading level link, which can be found in the left-hand navigation of the new search GUI.
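If you want a rough, local proxy while you work on your copy, a classic readability formula is a decent starting point. Google does not publish how it classifies basic/intermediate/advanced, so treat this Flesch reading ease sketch, with its naive syllable counter, as directional only:

    import re

    def count_syllables(word):
        # Naive: count vowel groups, drop a trailing silent "e".
        word = word.lower()
        count = len(re.findall(r"[aeiouy]+", word))
        if word.endswith("e") and count > 1:
            count -= 1
        return max(count, 1)

    def flesch_reading_ease(text):
        # 206.835 - 1.015 * (words/sentences) - 84.6 * (syllables/words);
        # higher scores mean easier text.
        sentences = max(len(re.findall(r"[.!?]+", text)), 1)
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        n = max(len(words), 1)
        return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

    print(round(flesch_reading_ease("Simple is better than difficult."), 1))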

//Jesper

 

March 29, 2011

Holistic SEO for the Data driven web

I see a link to an article or blog post bashing SEO as a practice just about every week. I have started to care. In the beginning I didn’t, but considering I hear so much bs from so many people right now, I have to write something.

SEO is only partially Google optimization. It is about Facebook, Twitter, YouTube and all the other websites where you really should be present if you want to be visible on the expanded web. But this article is not about that either, as both I and others have written about this before.

This article is about us humans and why we search. It is also about why search will always be a factor regardless of what type of technology we use. Third, this article will be about how to move into a new domain of search that is data driven rather than URL-bundled.

Understanding the Human searcher

When we produce content, regardless of channel, we have to consider the domain and the properties of the community in which we publish it. People search in different ways depending on where they search. I usually recommend that companies search for content similar to what they are producing before giving it its proper title. They should record every search they make until they find what they need and want.

They should then do a standard volume vs. competition analysis and produce the content for its proper keywords. If they want it to be found by relevant users, they should also consider what the searcher is searching for, naturally. Most companies don’t. But they should.

If you publish a video to YouTube, don’t forget to use the word “video” in your title, as people use the word “video” when they search for video. If you publish an update to Facebook, then remember to add a question or statement to it so that people have a way of responding to it, thus leveraging the power of social network interaction. Remember that the perspective of a social network searcher is rather “what is my friend doing” than “cheap sunglasses discount coupon”. This means that the social network searcher starts by searching for a friend, ends up on the news feed, and there sees a comment made on your wall. They follow the link of the comment and reach the discussion of the wall post. They get indoctrinated in the types of words used and follow links that seem to be the conformed consensus of the active respondents.

If you publish something to Twitter, then try to own a hashtag, as these types of “back channels” are used to categorize content and give different 140-character expressions a semantic context. If you then push something in a forum, you can use the hashtag so that people will search for it to find out more about the topic you are discussing.

To understand the human searcher is to understand purpose. There is no search without a purpose. Your greatest mission in life as a content producer is to use all assets available to fulfill the purpose of those searchers who search for content that is relevant for you and your business. The human searcher does not roam; they want to find salvation for their perceived desire. This will never die. This will never go away, and it is the very core of what search is all about.

Thus, when optimizing content for this purpose, we need to begin with the human's purpose in mind.

Technology is not the biggie

When thinking of SEO in a holistic way, you need to understand that all your activities online AND offline matter. There are simply too many signals telling search engines and social networks where your content should be placed with regard to other content.

Search engine optimization is technology independent, as your main purpose cannot be to produce content that matches all these signals. Especially considering you probably don’t know which ones they are and what they will do for your content. Your main ambition should be to optimize the flow of information so that it aligns with your purposes.

This means that you should probably offer ratings and reviews on your website. If you cannot do that, you should feed them from another website onto yours. Your payment is a link, and your sacrifice is some Google ranking for the competition-intense keywords. However, by adding the content you are also relating yourself to the information, and as long as it is relevant, you will become more relevant for the end user and thus increase the likelihood of them returning.

So what do content satisfaction and aligning content into sets have to do with independence of technology? Well, in the end it is all just content displayed in different formats. However, as more and more content formats are becoming standardized, or easily converted through simple processes that you can automate, the technology you use is becoming less relevant. Regardless of whether you publish to your own platform or to another place such as Facebook.

It is the content that matters, not the technology that contains it.

Technology should only be used when it simplifies the challenges of communication, not because it is the new black. If you want to optimize your content for a technology such as Google, and at the same time disregard other technologies, then I feel you are making a big mistake. However, if you focus on the purpose driven end user, you will probably gain a lot more.

If you don’t have the content that will make you found, then align yourself with someone who does. Co-ordinate your communicative efforts with what’s out there already and then take advantage of the relationship a complementary position might offer.

Which brings us to the Data

Cause there is nothing as exciting as considering data in closed environments, sourced from open environments. Naturally, I am talking about sites vs. apps. In the coming few years we will see the death of the URL and the complete explosion of the dynamically generated response app.

Today, searches are limited in their relevance, as most search engines return URIs from an index and present them as a list. The content that becomes available through these resources was created before you searched for it, and thus it most likely does not take into consideration what you plan to do tomorrow.

The data driven search engine optimizers must understand this. In the data driven search, made through an application, the user will ask questions such as “restaurants I like in Gothenburg where 2 of my friends have been and that are similar to that one restaurant in Sundsvall”. A search engine of today has NO possibility of replying with specific content meeting your demands for information. However, the data driven web will.

It will source information from all of your friends, all image websites, all review websites and all other such resources needed to understand what you “like”, which of your friends have been at a specific venue, and whether it uses the same furniture and menus as the other place you want it to resemble.

This is why I say that you need to align your content with other relevant content, thus relating it to networks of people, brands and other such domains that are necessary for you to become findable by someone who has never met you, yet has a lot of opinions about other things similar to you.

The Holistic approach to SEO

So, as you can see from above, this was not the Holistic thinking that is usually put out there. My main point with this post is that you should focus on the humans doing the searches, their needs rather than their preferred technology and then understand that the data driven web will truly change the game as users will ask for more than what you want or are able to give them.

A holistic approach to SEO doesn’t take aim at anything. Rather, it takes aim at being findable by relevant Humans in everything. Regardless of technology or current new black.

November 16, 2010

Flash is Google’s friend, only the Users left to convince

Voices within my network were raised earlier this week when Google announced that they have managed to further crawl Flash websites. In short, Google is proud to announce that they are able to crawl text within Flash better than before. Nothing wrong with that. Actually, there is nothing wrong with Flash at all. It is the misuse of it that is the problem. You know, Flash don’t kill websites… people kill websites…

Too few people have been taught how to properly include Flash files. People are also lazy and thus don’t care to learn more than one technology. This leads to an excessive use of Flash where CSS3 and HTML4 (not the over-hyped and still not accessible HTML5) do a better job. The benefits of jQuery shouldn’t be exaggerated either. jQuery is wonderful when used in a user friendly manner.

What ticks me off, however, are the inconsistencies in Google’s message here. I agree with Hessam’s comment (scroll a bit amongst the comments or Cmd+F hessam). The biggest problems with Flash aren’t solved just because Google indexes the websites. Flash still:

  • Renders horribly, if at all, on mobile devices
  • Loads slower than I take a shit
  • Has a tendency to give too much freedom to designers giving users a series of ADD experiences

However, Flash is pretty darn shiny if used correctly, with a fallback and with Mobile, Print and web CSS. But now I am missing the point again. Let’s go back to the inconsistencies in Google’s communication. Have a look at the first 10 minutes of this 60-minute I/O talk from Google.

Don’t they repeatedly say: “There is no text on this website”?

The same goes for all Flash websites out there. Regardless of what’s in the SWF file, these websites have no text on them. And regardless of whether Google can crawl the website or not, it takes them a hell of a lot more time to crawl a heavy Flash website than a standard HTML website.

But HTML websites tend to look so pale… bah…

But what do the users like? Let’s have a look at the top 50 most popular websites online.

Users like:

  • Google.com
  • Facebook.com
  • YouTube
  • Yahoo
  • Microsoft Live
  • Baidu
  • Wikipedia
  • and it goes on…

Users don’t seem to flock to design-heavy websites, do they? What are the users looking for then… if they are not looking for good looks? They are looking for good content. Looks are secondary. Most designers get it wrong anyhow. They want the “picture” to look nice. Not to be functional. In Flash you can make the prettiest things. But they are a pretty image. Not a functional website.

If you design a good-looking website based on the full image of the website rather than its parts, then you are better off painting a picture. However, if you make each one of the parts look and work well, then you have the correct components to help the user USE your website.

Regardless of how good-looking your website is, you need users actually wanting to use it. Not only look at it. Especially with the modern and social web. You NEED to have a website that is indexable in more than one way, and you also need users to interact with your web content in order for it to be organically shared between users through their different networks.

Anyhow… I just needed to be a part of the discussion…

October 23, 2010

Content Governance

One of the biggest problems facing large organizations today is that of content. It might be a bold statement to make, but after consulting for, working for, and producing for a series of large corporations I have seen patterns in their challenges. I have now managed to mold these different problems into 5 categories that I put underneath a common umbrella I have chosen to call “Content Governance”. These five areas are:

  1. Content lifetime
  2. Duplicate content
  3. Content ownership
  4. Content findability
  5. Content shareability

1. Content lifetime

Content lifetime is something that has intrigued me for the past few months. I have been trying to find a way in which I can effectively measure how lasting a piece of content will be in the early stages of its life. The common denominator for all long lasting content I have found is inbound links.

Stuff that receives a lot of inbound links maintains its popularity over time. Not only in the search engines, as one might assume, but it is also shared over longer periods of time in social media. Content lifetime is central to all KPIs online today, as it is not the content itself but how it is used that affects how successful or not we are online. Especially if we still work in short campaign cycles where we need to increase the reach-over-time parameter in all our campaigns.

Content lifetime is also affected to some extent by the type of infrastructure or marketing it has to support it. Not all campaigns go viral, yet they can be highly successful nonetheless as they manage to target the correct audience. Word of mouth travels outside the web and is, as I have argued previously, also a result of product delivery according to the promises stipulated.

Broken promises are one of the worst things you can be a part of as an online business owner (if they don’t increase your inbound links a lot). Broken promises almost always kill your content on the spot. Regardless of whether it is a blog competition or a white paper people can download. See, once you start breaking promises, your infrastructure will become more costly to maintain and thus will not only hurt your capacity to manage your content over time, but will also shorten the lifetime of your content objects. Why, you say? Well, your content will only live as long as it does not encounter your rumor. Depending on the density of the distrust, your content lifetime will be. (wow… Yoda sentence…)

2. Duplicate content

There are many forms of duplicate content, as well as reasons to care about it online. The two most common discussions regarding duplicate content appear when discussing SEO practice and social media technology.

Duplicate content – SEO challenges

  • How do you avoid duplicate content and meta on your own website?
  • How do you ensure that the origin source of your published news appears in the search engine result pages for related searches?

Duplicate content – Social media challenges

  • Cross technology activity
  • Time zone coverage of content

This discussion becomes quite apparent when you analyze multilingual content on several top-level domains (TLDs). Let’s say you have a corporate policy that says you should offer all your services in the local language and English. Let’s also assume that you are heavily regulated due to internal policies or country-specific legislation. Let’s just for the sake of it say you are a medical company publishing your product information online.

First off, the local language translation is no problem, as most major search engines have the ability to understand that a local language translation on a local TLD is more relevant than one in English (if your local language isn’t English). But what about your centrally governed original text in English? This one will be the same for all URLs and thus duplicate content on all websites. IP range redirection is not considered cloaking, but Google will take notice of this redirection and probably value you less because of it. How do you solve this problem then? Well, local links are one way, but local links to English versions are difficult to obtain when marketing laws don’t accept marketing of medical content in most countries and when most people would anyhow link to the local language version. Then how the heck do you deal with this problem? Especially when you are a corporation dealing with a growing affiliate/spam problem and want to stay within the guidelines of Google.

I must say that I have not come to a conclusion on this question except for internal linking. However, that is a delicate issue, as you cannot do it excessively or between languages. Adding your website to Google Webmaster Tools might help, but this doesn’t work for your rankings in Yahoo and Bing. However, adding the meta tags for geo location works in Bing, perhaps in Yahoo, so you should probably go ahead and do that. But perhaps your CMS doesn’t allow for this, as your website was built in 1996, and so you have to solve it some other way.

At this point you probably don’t see the relevance in this anymore. But seriously, you should, as you are probably stuck in one of them commerce systems from ’84 that is localized for each country. You probably see the increased cost in those orders that end up in the wrong country and the increased cost of customer service that is redirected due to international clients calling the wrong customer service or complaining that they cannot reach you in time. This is of extreme importance to you. You need your traffic to end up in relevant places. Thus, you have to solve these types of content problems when planning your online communication and publication of content. Duplicate content is one of the most horrendous problems out there, as the web has turned social and sharing has turned into one of the major practices out there. You do not only compete with yourself but with the rest of the world over owning your brand. Thus you’ll have to plan for this when starting your communication efforts in social media.

Don’t just use RSS feeds and auto publishing. This will lead to as much scraping and duplicate content as it will lead to irrelevance amongst those who are interested in what you have to say. I should probably write a complete blog post about that later. However, when working with governance of duplicate content you should give your employees enough freedom to work and rework the content you offer into a local context. I usually talk about allowing a culture wash to occur before content is published in any language on a local market. Do not care whether it is how you would say it in English to an English audience, but care about how it would sound in English to a local reader. Think in terms of culture and level of English, and always provide a link to further reading on the English website as well as a dynamically created purchase link depending on referral country. (In the above given example.)

Not all problems are built like the one above, but you should consider how to maintain unique content in all channels and not just on every specific website. As I’ve previously argued, a website is irrelevant on the web if not considered in terms of its unique and separate URLs.
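On the detection side, you can at least catch the worst case, the exact same English text served on every TLD, by fingerprinting normalized body text. A minimal sketch with made-up example.* URLs; real pipelines use fuzzier techniques such as shingling:

    import hashlib
    import re

    # Flag identical pages across your own URLs by hashing normalized text.
    def normalize(text):
        return re.sub(r"\s+", " ", text).strip().lower()

    def fingerprint(text):
        return hashlib.sha1(normalize(text).encode("utf-8")).hexdigest()

    pages = {
        "http://example.co.uk/product": "Our product does X. It is great.",
        "http://example.de/en/product": "Our product does X.  It is great.",
        "http://example.de/product": "Unser Produkt macht X.",
    }

    seen = {}
    for url, body in pages.items():
        fp = fingerprint(body)
        if fp in seen:
            print("duplicate:", url, "matches", seen[fp])
        else:
            seen[fp] = url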

3. Content ownership

A much more easily consumable question is that of content ownership. If you want to be successful online today, someone has to be the owner of the content you publish. This means that someone, and not something, has to answer for it. If you publish a press release, you should clearly denote who will be in charge of answering to bloggers, journalists and suchlike for each specific place where you are active.

Let’s say you are active on Facebook with customer service issues, and you publish a piece of content on your website about prostitution scandals in Rwanda. Then you also have to take into account that these will probably surface on Facebook as well. If you have directed questions in your press release to a specific person, then that should be the person who answers the questions, or at least appears to answer the questions, on Facebook. I’m just saying that users will expect the sender or owner of the content to actually have a dialogue with them about this content.

Without an owner of content you will have huge governance problems internally as well. If you do not own the content, you will have no responsibility for its performance. You have to be measured on the content's success online and you have to be accountable for its performance and the reactions that come from it. You should be rewarded if it is successful and probably spanked with a broom if it is not. If there is no sense of ownership of your published content, then your website will become a yard of waste rather than a communicative tool used by your employees to surface resources valuable to your corporate and product communications.

4. Content Findability

Content findability should be split into some separate thought silos of worries. First off is that of governing your content's structure, layout, design and disposition. This is important and can be monitored with several different tools. You should, in contrast with this post:

  • Write a lot of bullets
  • Use plenty of sub-headlines where it makes sense
  • Write short and concise sentences

All so that your on-page findability is increased and the user can easily find what they search for on your web pages.

Secondly, you should look at increasing your findability on platforms other than your own. If you look at a SERP for my name, you can see that I cover the first page with search results leading to MY owned content all over the web. I have several social media accounts, blogs, newspaper articles and website pages that I have more or less control over. This makes it very easy for me to govern the first impression a user gets of me when searching for people with the same name as me.

Third, you should look at findability from an on-site search perspective. It is important that the approximately 25% of your visitors who use your internal search engine get the proper results for their searches. If you cannot afford to buy your own search engine, Google offers a free one that you can add to your website.

5. Content Shareability

To make content shareable you have to give it a story worth talking about. You can either do this on your own website by adding some reflections about why you added the content and how users can use it. Or you can do it by simply telling a short story whilst publishing it to an external platform. A good rule of thumb is that you make a statement or a promise, then close the discussion by narrowing the statement with a question. H&M is really good at doing this on Facebook and I suggest you visit their Facebook Page and have a look.

The way in which you govern this in large corporations, and how content is then shared between users, is basically up to you. If you set up a KPI you should try to measure change over time on whatever you measure, as this tells you the pace and movement/direction of your content. You can also use one of the many social media monitoring tools on the market to see how your initially published content is being shared between users. Don’t forget the story about the story. Add a tracker through bit.ly or a #hashtag in order to see if people pick up the keywords you are trying to push.

Concluding thoughts

I am unsure whether or not this post makes any sense, but I wanted to write something about content governance/production to see if I could spark up a dialogue with other people interested in the same thing.

September 10, 2010

Google Instant – a Reminder that Google is not equal to SEO

Those of you out there who put an equals sign between SEO and Google should really reconsider. Google is only one of many popular search engines out there, and if you aspire to be a good international SEO you have to take all of them into account when working out how to bring your clients or your company success.

Let’s start with a little video about what Google Instant is all about, so that we know we’re talking about the same thing here. It is really nothing more than an advanced Google Suggest with an annoying ajax search result page.

Google Instant – what it means

There are a lot of the same old buzzmakers trying to make a buck out of the next thing out there. I don’t know when Steve Rubel started making his name as an SEO; I’ve always seen him as a PR consultant. He writes in his blog that Google Instant will kill SEO. Without even taking the other search engines into consideration, my take on the story is that it won’t even kill SEO in Google.

Google Instant actually improves search, and thus also optimization, by delivering new content to the surface for searchers. It does decrease the importance of bulk links somewhat, as search will be increasingly long tail: the number of words used in search queries is likely to increase. This also puts a demand on publishers to increase their stock of relevant content.

If you as an SEO want to optimize for the web, only a few of these long tail searches will be relevant, as you will only put effort into what brings you cash back. The demand that you as an SEO understand analytics as well as conversion will increase, as you probably won't be able to bulk traffic in a full vertical any longer. You will have to specialize and know what kind of searches bring you the dough. You will quite quickly see what search queries appear as referrals in your analytics and can thus make the needed changes in order to rank for those keywords that convert.
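A minimal sketch of that referral mining, assuming you have raw referrer strings from your logs (in 2010, before “not provided”, Google referrers still carried the search query in the q parameter):

    from urllib.parse import urlparse, parse_qs
    from collections import Counter

    # Pull search queries out of Google referrer URLs; the query
    # sits in the "q" parameter of the referring search URL.
    def queries_from_referrers(referrers):
        queries = Counter()
        for ref in referrers:
            parsed = urlparse(ref)
            if "google." in parsed.netloc:
                q = parse_qs(parsed.query).get("q", [])
                if q and q[0]:
                    queries[q[0].lower()] += 1
        return queries

    refs = [
        "http://www.google.com/search?q=install+antivirus",
        "http://www.google.se/search?q=installera+antivirus",
        "http://example.com/some-page",
    ]
    print(queries_from_referrers(refs).most_common())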

In many ways, this kind of search will become more predictable, not less predictable. It is obvious that Mr. Rubel doesn’t put much thought into the combination of analytics and on-page/off-page optimization. If you don’t focus on the analytics when working with SEO, you are just about as relevant as a politician answering a question they don’t know the answer to.

Google Instant is an adaptation to the social web – and so should SEOs be

Many SEOs don’t like social media as anything other than a place to complain about the fact that they don’t like social media. An SEO who wants to be successful in the long tail really should read up on some of the old spam tactics created by auto-generating content from the hidden web. Then translate those theories onto what you can do when you have actual people producing content on your platform. That’s user generated content I’m talking about. That’s social media.

Users will always be able to generate long tail discussions you would never be able to come up with. A good SEO knows how to take advantage of this. Not only by using the content generated, but also by using the internal links created with user generated tags and themes created by whatever system you’re using. If you’re a WordPress user, then play around with SEO Smart Links and your category pages and you’ll see what I’m after.

Bad SEOs won’t stand a chance – that’s correct

Google Instant will make SEO more of a science, a more exact science. It will require thought, analysis and whatever good SEOs have been doing for many years. It won’t, however, kill SEO as a practice. It MIGHT disqualify some wannabes, but I am sure they will still have a market, as it is quite easy to remove Google Instant as your display mode.

Google is NOT equal to SEO

I should end this post with a reminder that Google is NOT equal to SEO. We still have YouTube, Twitter, Facebook, intranets, on-site search, catalogs and all the other very popular search engines to take into consideration. Those are equally important, as they can bring you extremely relevant traffic and improve the way you communicate with your website visitors.

SEO is about improving web communication. Get information searchers the information they need. Get buyers the stuff they need. AS FAST AS THEY WANT IT. Good SEO is not about getting a lot of traffic, but about getting the traffic that wants your stuff. All else is a waste of everyone's time.

Now. Google Instant will be very interesting to study from a user behavior perspective. As the data will (hopefully) be more standardized, and the number of different long search queries will decrease, we will probably also have to adjust our tactics towards a web where we get less relevant traffic than we’re used to on some previously highly converting keywords. However, I am sure users will adapt and learn…. or a competitor to Google will really push for a change back to what the users like. Only time will tell.

August 27, 2010

Checklist for Social Search

There has been some talk about social search, but I lack the complete guide to it, the general reflections and the updated know-how. Very few people have conducted real tests and there are generally only guesses out there. I have studied search within Facebook for some months now; perhaps that’s why I’ve focused so much on Facebook in my blog posts. However, there is more to social search than just what “the book” has to offer.

Social media as the #1 referrer

No matter how things work in different algorithms and how pages rank in relationship to each other, it is quite evident that social aspects are becoming increasingly important for search. For many news pages, social media has become the number one referrer and we are seeing some evidence that websites in the travel vertical and other such experience/story based content websites are moving towards the same pattern.

Traffic is one thing, what about conversion?

I’ve covered social media and conversion in some of my older blog posts. If you seem to get an increasing number of non-converting visitors from social media, you should probably have a look at those. Or perhaps flip through my latest keynote on Social media money.

Social Search – yeah… how to get the traffic…

But I’ve already written about that and want to post a fluffy Friday post. Thus, I’ll continue writing about traffic. What you have to think about when optimizing for social search is that you are dealing with two types of searches.

  • Content that you search for
  • Content that finds you

The general idea behind social content search is that content closely related to your network ranks better than content that is far away from your network. In practice this means that Google will surface search results that people closely connected to you have shared, whilst Facebook will rank content that is liked by many of your network buddies.

The general idea behind content that finds you is content that is being posted to news feeds and that is being discussed on the real-time web. This means the content currently ending up in top-list directories for tweets, currently being posted to your query tabs in TweetDeck, and being liked, commented on and re-shared in the news feeds of Facebook. You do not search for this content; you search for what’s on the mind of the person sharing the content. The person behind the share is in focus rather than the share itself. If you’re in the same mood or have an opinion, you too will share the content to your network, which is listening in on what you are thinking.

I’ve thought about creating a method on how to rank in social search engines, but I sort of realize that this is too much of a deal as I would have to cover so many aspects. However, this blog post shall give you a short list on how to plan your work in order to sort out the unnecessary and thus also increase your focus and likelihood of success.

  1. Start off by creating a list of technologies where you want your content to be findable.
  2. Make an inventory list of what kind of content people are sharing on those platforms.
  3. Have a look at your content (now back at me, then back at your content), does your content resemble what is being shared on the desired platform.
  4. Turn your content into social objects by making them shareable (i.e. turn it into the kind of content you find people sharing on the platform) and indexable by the platforms of your choice – as an example you can read my post on how to index and rank on Facebook.
  5. Make your objects easy to share. For example, create news releases about product pages, create videos for tutorials, and make images and other easily consumable content available. I.e. make it web content.
  6. Make your objects easy to interact with and connect them to the social platform. For example, add comment fields, ratings and such interactive ingredients.
  7. Oh… yeah… start talking to people online… or create a sophisticated bot to do it for you (me)
  8. Choose social sharing buttons with care… perhaps don’t add them at all, but think of how to integrate the sharing into how the user interacts with your page – the Like button is an exception, as it, just like ratings, is a statement induced by the user “liking” the page – something I should write a post on… anyhow..

Please continue in the comments field. Have a nice weekend.

August 5, 2010

Evidence that Static, User Friendly URLs are Still better

[This blog post is a rewrite of a Swedish post by Magnus Bråth who published his findings on the Swedish SEO blog sokmotorkonsult.se. If you want to do the same with any of my blog posts into your own language, please ask first.]

It is not usual for me to do rewrites on this blog. However, since I get the question about static vs. dynamic URLs a lot, I thought this post might be interesting to rewrite for those of you who don’t speak Swedish. Magnus Bråth, a Swedish SEO with many years' experience of building competitive performances within various sick verticals, made this tiny little test to see whether there was any sense in the assumption that Google doesn't give any extra value to static or “user friendly” URLs. Google says that they can read the dynamic ones equally well. As usual, a lot of people jumped to the conclusion that it is now just as good to use dynamic URLs. Some even said it would be better to use them. For some complex calendar or forum indexing this might actually hold. However, what Magnus found was that if you can keep the keywords in your URL, you are better off than if you don’t.

Magnus's test basically went down something like this:

  1. He wrote a blog post that he wanted to rank for the Swedish keyword “installera antivirus”, which logically means “install antivirus”.
  2. The URL contains the keyword “installera antivirus”
  3. Nothing else in the post includes the first of the two words, “installera”
  4. However, when you search for “installera antivirus”, the post ends up as number 1

Perhaps Magnus only wants to show the strength of his blog, but regardless of his reasons, this helps us draw the natural conclusion that using static and user friendly URLs does affect indexing (at least, if not rankings as well). If it didn't, then his post simply wouldn’t rank or be indexed, given that the keyword is placed solely in the URL.
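If your CMS doesn't already build keyword slugs for you, a small helper keeps the keyword in the URL. A minimal sketch; WordPress and most modern CMSs do this out of the box:

    import re
    import unicodedata

    # Turn a title into a keyword-carrying URL slug.
    def slugify(title):
        # Fold accents (å, ä, ö -> a, a, o) so the slug stays ASCII.
        ascii_title = (
            unicodedata.normalize("NFKD", title)
            .encode("ascii", "ignore")
            .decode("ascii")
            .lower()
        )
        return re.sub(r"[^a-z0-9]+", "-", ascii_title).strip("-")

    print(slugify("Installera Antivirus – så gör du"))
    # installera-antivirus-sa-gor-du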

This also means that those of you talking about going to Google to find out what the best SEO advice is at the moment just might consider thinking twice before talking too loudly. It just might be so that they “don't tell the truth” all the time.

A simple but effective test to demonstrate something that a lot of people are thinking. Thank you, Magnus, for taking the time to share this test.