Digital Marketing Blog
September 24, 2013
Keyword data seems to be disappearing from Google Analytics. So. How do we know what to optimize our websites towards? How will we know what supporting keywords to use in our texts? How will we be able to run keyword-targeted conversion optimization on the organic traffic we have earned?
No worries. We’ll get to that. Obviously Google says they want you to work on your website, to increase your quality and better serve users with better content. However, you will now have to resort to paying up in order to get the juicy stuff.
Below is a set of tools you can use to get at that keyword data anyway.
1. WordStream is the best keyword tool out there – use it
- WordStream is a tool that helps you create AdWords keyword groups for your content. It helps you find the most profitable niches and is available in many different languages.
- I suggest you create a subscription account on WordStream. It will cost you some money, but it is surely worth it.
2. How to bid on AdWords to get keyword data
- Use the Keyword Tool and generate dynamic tracking URLs
- Do not bid broad – you will lose a lot of data that way, since the exact matched queries will not be displayed in your account
- If you are simply interested in search volumes – use the Google Keyword Tool AND place minimum bids on all keywords you can possibly think of
- Bid narrow and use negatives (see the sketch below).
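Here is a minimal sketch of what such a dynamic tracking URL setup could look like. The utm_* naming, the example.com landing page and the keyword/negative lists are made-up assumptions; {keyword} and {matchtype} are AdWords ValueTrack placeholders that the ad platform fills in at click time.

```typescript
// Sketch: build a destination URL that carries the matched keyword into
// your analytics. Parameter names and values here are illustrative only.
function buildTrackingUrl(landingPage: string, campaign: string): string {
  const params = [
    "utm_source=adwords",
    "utm_medium=cpc",
    `utm_campaign=${encodeURIComponent(campaign)}`,
    // {keyword} and {matchtype} are ValueTrack placeholders; AdWords
    // substitutes the keyword that triggered the ad when it is clicked.
    "utm_term={keyword}",
    "matchtype={matchtype}",
  ];
  const separator = landingPage.includes("?") ? "&" : "?";
  return landingPage + separator + params.join("&");
}

// One tightly themed, exact-match keyword per ad group, with negatives
// to keep broad queries from polluting the data.
const adGroup = {
  keyword: "[running shoes]",
  negatives: ["free", "used", "repair"],
  destination: buildTrackingUrl("http://example.com/shoes", "shoes-exact"),
};
console.log(adGroup.destination);
// -> http://example.com/shoes?utm_source=adwords&...&utm_term={keyword}&matchtype={matchtype}
```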
3. Activate internal search tracking in your Google Analytics
- I find it awful that so few companies have activated their internal search tracking. MORE than 20% of the people on a website use the on-site search. You will get the exact keywords to optimize for PLUS deliver a brand-building experience to your users if you work on your internal search.
- To activate it: open the profile you want to track > click Admin in the top right corner > click View Settings > enable Site Search tracking at the bottom of the General Settings page. The parameter you need to enter is the one the search query generates in the URL. If you search this blog for “search” – http://jesperastrom.com/?s=search – the parameter you would enter is s.
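If you are unsure which parameter your own site generates, a tiny helper like the one below (a sketch using the standard URL API) will show you the candidates from any sample search URL.

```typescript
// Paste in a sample search URL from your own site to see which query
// parameter carries the search term - that is the one to register in
// Google Analytics' Site Search settings.
function listQueryParams(sampleUrl: string): Array<[string, string]> {
  return [...new URL(sampleUrl).searchParams.entries()];
}

console.log(listQueryParams("http://jesperastrom.com/?s=search"));
// -> [["s", "search"]]  => enter "s" as the site search parameter
```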
4. Download the “Not Provided” kit
- This is a kit – actually an analysis framework – that one of my former students, Markus Frick, tipped me about on Facebook. Download it from here.
Ok. So that’s 4 ways. I know you’ve got one to add. Comment away!!!
October 21, 2012
Do you have a mobile CSS on your website? I think it is time you got one. Especially if you work with lead generation, sales or any type of transaction on your website. Mobile traffic is increasing like a mother fucker and you need to be prepared for it. Are you still asking whether to build a native app or a mobile website? Don’t. Build a website. First. Then see if you have the need for an app.
What is cloaking?
Going back a few years in time, a practice was used to deliver one type of content to the visitor and another to the Google bot. This would make Google believe that your web pages were filled with valuable text and image content that was highly relevant for some keyword. Whilst, in fact, your page, when served to a normal user, had hardly any copy on it at all. It was simply a cluster of your best-converting content put into a structure where it would convert as many visitors as possible.
Webmasters did this so that they would rank for stuff their pages didn’t really deserve to rank for. This practice was called cloaking.
Google solved the problem by further developing their algorithm so that it would detect the practice. They also added manual checks of websites in order to rule out any sneaky bastards who had managed to reverse engineer the changes to the algo. In a sense, cloaking is a thing of the past, at least when it comes to desktop-to-web interaction.
Mobile cloaking is here
However, as the web has expanded to include mobile devices, our cloaking skills are experiencing somewhat of a renaissance. People are, to a greater extent, picking up their mobiles to search for whatever they want to find. From our analytics we have seen mobile usage of the websites we monitor increase gradually. For the past few months, that increase has started to accelerate.
So, what should you do with this traffic when it comes to your website? My suggestion is that you cloak the living shit out of it.
So what do I mean when I say “cloak the living shit out of it”? Well, basically I am talking about offering your mobile visitors another interface than the one you serve your computer browser users. Fair enough. But I mean seriously change the way it looks on mobile. Users are lazy. Remove the unnecessary stuff. Give them only what is essential for conversion: either a lead via an e-mail form, or a simple one-click-to-phone-number purchase, as in the sketch below. Google doesn’t seem to penalize you for this. Perhaps because you are actually delivering a better page experience than the one with all the text on it.
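As a minimal client-side sketch, this is the kind of thing I mean. The .nonessential selector and the phone number are made-up placeholders; in practice you would do most of this in your mobile CSS or on the server:

```typescript
// Strip the page down for mobile visitors: hide everything that doesn't
// drive the conversion and surface a one-tap call-to-action instead.
// Selector and phone number below are hypothetical.
const isMobile = window.matchMedia("(max-width: 480px)").matches;

if (isMobile) {
  document
    .querySelectorAll<HTMLElement>(".nonessential")
    .forEach((el) => (el.style.display = "none"));

  const cta = document.createElement("a");
  cta.href = "tel:+46000000000"; // placeholder number
  cta.textContent = "Call us to order";
  document.body.prepend(cta);
}
```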
What is your Mobile cloaking opportunity?
This will not only improve your conversion rates, but it will also improve usability for the visitor on the mobile phone.
And it is here that Google’s dilemma escalates. They will probably never be able to penalize you for these differences in delivery. Users do not like pages with loads of text on them on their mobile phones. Thus, as Google has the ambition to deliver great content, they will have to find a way to rank good mobile pages. The only way they can do that today is to use the web CSS delivery, regardless of whether you serve your visitors something else on the mobile phone.
From all I can derive from the numbers, Google will rank you on the cellphone based on how your webpage behaves for a “non-mobile” visitor. It will give you some advantage for localized content as well as for a mobile CSS. HOW this mobile CSS displays the content, on the other hand, seems to be of lesser significance.
People are searching the web through the phone. Ok. The search behavior is a bit different. It pays to keep an eye on what Google Suggest is giving users as options for your specific niche keywords. Use it. People are lazy and more prone to use what is offered to them. Either way, you should be seeing more traffic from the cellphone. Both social as well as search traffic.
So. That’s an opportunity for you.
October 11, 2011
There is this thing referred to as Black Hat SEO. The definition is always debated amongst search engine experts. Wikipedia defines it as the deliberate manipulation of search engine indexes. If that were the case, then every SEO, and every company wanting to advance in the search engines, would be practicing Black Hat SEO.
I define Black Hat SEO as activities where you automatically steal, hack, or generate filthy, link-drowning content to support your greedy ambitions in the search engine result pages. Practices such as content scraping, cloaking, comment spamming, database mining, pingcrawl, auto submitters and naturally – hacking of exposed servers – fall into the category.
Beginning my Black Hat Tools journey
I have previously bought links. A practice I recognize as gray. I have never hacked a website, or employed automated link building tools to create serious link influxes to my websites. Yes, I have actually automated social bookmarking, but not the nasty kind.
From today, that will all change. I am tired of not knowing, and will start a journey to research which of the tools out there work and which don’t. I have set a budget of $10,000. My goal is to purchase tools until that sum is spent. Here are my rules:
- The majority of content on the website shall be auto generated
- All back links shall be built automatically
- I am allowed to engage with users in social media, but I am not allowed to generate the content myself
- The domains have to be new
- I am not allowed to break any laws
What tools shall I test?
I have a short list of approximately 10 tools including some I am going to order custom versions of. I suggest you send me an e-mail to firstname.lastname@example.org if you want me to try out your service or if you know of a service you think I should buy and test.
Why do I want to try Black Hat tactics?
First of all I want to know if they make any sense. Can you still spam your way to the top of Google? I want to learn every detail of executing them. I need to know. Then I want to share this knowledge. What works goes into the toolkit. What doesn’t work gets called out. If we are to progress, we need to research, investigate and examine.
Plus. I think it will be loads of fun.
September 27, 2011
If you want your texts to be readable by many people, you need to write in a way that people understand. Simple enough, right? The easier to read, the more accessible. Simple is better than difficult and short is better than long.
I have regularly fallen victim to my own ego when writing tutorials, extending them beyond the limits of what is readable on a regular screen. Thank God for mobile devices making long texts readable again.
In the eyes of Google, the texts on your website are basic, intermediate or advanced. That is their reading level. Yeah, Google actually tries to understand your texts and then categorizes them under basic, intermediate and advanced labels.
This is a natural evolution, as Google wants to rank the most relevant content at the top of the search engine.
If a user cannot understand the content Google serves, that search result is less relevant than one the user can actually digest.
Write for the level of your audience
But hold on. Traffic is really just one aspect of SEO. Don’t rush out and rewrite all of your texts just yet. Traffic might be lower as a result of getting only people with an advanced reading level to your website, but perhaps that doesn’t make your website less relevant for those users who actually end up there. I guess the thing here is to know your preferred/converting visitor demographics/sociographics and write for them.
Consider a camera retailer. You are selling cameras and you hire the photographer nerd of a lifetime to write your copy. That person will write about the camera knowing all the right jargon. Have a look at the camera description from Amazon below.
The standard camera buyer might not know all the widgets, gadgets, settings and data that need to be in place for the perfect shot. They might be searching for terms such as cool cameras. As on many camera websites, the only info you get about the camera is this spec sheet. I would say it is advanced, and so, probably, would Google. Thus, although it is long tail, it will probably have a difficult time ranking on its own against a page that carries both this information and easier text to grasp. I.e.:
- Take two pages with the same technical-description mumbo jumbo
- The one that adds extra, simplified jibber-jabber text will get more traffic, even for searches on the tech terms
Amazon realizes that it doesn’t have the resources to rewrite all product descriptions in its database and thus it allows for user reviews where the product usage and likability can be discussed. Naturally, the more common the language, the tougher the competition (hopefully) and the more potential traffic. By using both you are covered in the advanced long tail as well as in the common peak.
Now consider a poker affiliate, perhaps a clearer example.
- Who is the user that will convert into a regular player?
- Is it the experienced player or is it the person who has watched high stakes poker on GSN on a weekly basis?
- What will they search for when they want to convert into a regular player?
- What is their terminology and what level of understanding do they have of the content you offer?
- Do they really understand the meaning of shove, felted, three bet or position?
Perhaps they are merely on the level of flop, turn and river.
When hiring an experienced poker player you might get the attention of the loyal poker elite, but you might not get the attention of the beginner. They might, on the contrary, be scared off by your advanced level and think that it is not for them. The reading level of your content thus becomes significant not only for the kind of traffic you get to your website but also for the kind of conversion you get out of that traffic.
Does this “REALLY” matter for rankings in the search engine?
So, does reading level have anything to do with SEO and the results in the search engine? Google’s latest update, called Panda, seemingly punished those who use complicated or more advanced texts and rewarded those who used easier language.
The job of an SEO has traditionally been to collect links, and that will probably remain the core of the work for quite some time from now. However, since the Panda update I have several tutorials that rank the living shit out of websites that have four or ten times as many links as I do. Perhaps not because of the level of the tutorial, but more likely because of their length.
I believe, as I have often argued, that many SEOs need to update what they aim for: from traffic to conversion rates. Conversion rates are the true measure of content quality and should thus be used as the main metric for relevance. I think that if we shift into measuring conversion rates ahead of traffic increases, content quality – reading level included – will improve as a result.
How to find the reading level of your website
In order to find out the reading level of your texts you should type site:yourdomain.com into the search field of a Google search engine. This will list all the pages of your website that Google has indexed.
To get the reading-level percentages you also need to click the reading level link, which can be found in the left-hand navigation of the new search GUI.
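Google doesn’t publish how it buckets pages into basic, intermediate and advanced. If you want a rough proxy you can run on your own texts, a classic readability formula such as Flesch reading ease gets you in the neighborhood. The bucket thresholds below are my own assumptions, not Google’s:

```typescript
// Crude Flesch reading-ease sketch: a stand-in for Google's undocumented
// reading-level classifier. Syllable counting is approximate.
function countSyllables(word: string): number {
  const m = word.toLowerCase().match(/[aeiouy]+/g);
  return m ? m.length : 1;
}

function fleschReadingEase(text: string): number {
  const sentences = text.split(/[.!?]+/).filter((s) => s.trim().length > 0);
  const words = text.split(/\s+/).filter((w) => w.length > 0);
  const syllables = words.reduce((sum, w) => sum + countSyllables(w), 0);
  return (
    206.835 -
    1.015 * (words.length / sentences.length) -
    84.6 * (syllables / words.length)
  );
}

// Threshold values here are assumed for illustration only.
function readingLevel(text: string): "basic" | "intermediate" | "advanced" {
  const score = fleschReadingEase(text);
  if (score >= 70) return "basic";
  if (score >= 45) return "intermediate";
  return "advanced";
}

console.log(readingLevel("The cat sat on the mat. It was warm.")); // basic
```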
March 29, 2011
I see a link to an article or blog post bashing SEO as a practice just about every week. I have started to care. In the beginning I didn’t, but considering how much BS I hear from so many people right now, I have to write something.
SEO is only partially Google optimization. It is about Facebook, Twitter, YouTube and all other websites where you really should be present if you want to be visible on the expanded web. But this article is not about that either, as both I and others have written about it before.
This article is about us humans and why we search. It is also about why search will always be a factor regardless of what type of technology we use. Third, this article will be about how to move into a new domain of search that is data driven rather than URL-bundled.
Understanding the Human searcher
When we produce content, regardless of channel, we have to consider the domain and the properties of the community in which we publish it. People search in different ways depending on where they search. I usually recommend that companies go search for content similar to what they are producing before giving it its proper title. They should record every search they make until they find what they need and want.
They should then do a standard volume vs. competition analysis and pick the proper keywords for that content. If they want it to be found by relevant users, they should also consider what the searcher is searching for, naturally. Most companies don’t. But they should.
If you publish a video to YouTube, don’t forget to use the word “video” in your title, as people use the word “video” when they search for video. If you publish an update to Facebook, then remember to add a question or statement to it so that people have a way of responding to it, thus leveraging the power of social network interaction. Remember that the perspective of a social network searcher is rather “what is my friend doing” than “cheap sunglasses discount coupon”. This means that the searcher on the social network starts by searching for a friend, and ends up on the news feed where they see a comment made on your wall. They follow the link of the comment and reach the discussion of the wall post. They get indoctrinated in the types of words used and follow links that seem to be the conformed consensus of the active respondents.
If you publish something to Twitter, then try to own a hashtag, as these types of “back channels” are used to categorize content and give different 140-character expressions a semantic context. If you then push something in a forum, you can use the hashtag so that people will search for it to find out more about the topic you are discussing.
To understand the human searcher is to understand purpose. There is no search without a purpose. Your greatest mission in life as a content producer is to use all assets available to fill the purpose of those searchers who search for content that is relevant for you and your business. The human searcher does not roam; they want to find salvation for their perceived desire. This will never die. This will never go away, and it is the very core of what search is all about.
Thus, when optimizing content for this purpose, we need to begin with the human’s purpose in mind.
Technology is not the biggie
When thinking of SEO in a holistic way, you need to understand that all your activities online AND offline matter. There are simply too many signals telling search engines and social networks where your content should be placed with regard to other content.
Search engine optimization is technology independent, as your main purpose cannot be to produce content that matches all these signals. Especially considering you probably don’t know which ones they are and what they will do for your content. Your main ambition should be to optimize the flow of information so that it aligns with your purposes.
This means that you should probably offer ratings and reviews on your website. If you cannot do that, you should feed them from another website onto yours. Your payment is a link, and your sacrifice is Google ranking for the competition-intense keywords. However, by adding the content you are also relating yourself to the information, and as long as it is relevant, you will become more relevant for the end user and thus increase the likelihood of them returning.
So what do content satisfaction and aligning content into sets have to do with independence of technology? Well, in the end it is all just content displayed in different formats. And as more and more content formats become standardized, or easily converted through simple processes you can automate, the technology you use becomes less relevant. Regardless of whether you publish to your own platform or to another place such as Facebook.
It is the content that matters, not the technology that contains it.
Technology should only be used when it simplifies the challenges of communication, not because it is the new black. If you want to optimize your content for a technology such as Google, and at the same time disregard other technologies, then I feel you are making a big mistake. However, if you focus on the purpose driven end user, you will probably gain a lot more.
If you don’t have the content that will make you found, then align yourself with someone who does. Co-ordinate your communicative efforts with what’s out there already and then take advantage of the relationship a complementary position might offer.
Which brings us to the Data
Cause there is nothing as exciting as considering data in closed environments, sourced from open environments. Naturally I am talking about sites vs. apps. In the coming few years we will see the death of the URL and the complete explosion of the dynamically generated response app.
Today, searches are limited in their relevance, as most of them simply return URLs from an index and present them as a list. The content that becomes available through these resources was created before you searched for it, and thus it most likely does not take into consideration what you plan to do tomorrow.
The data driven search engine optimizers must understand this. In the data driven search, made through an application, the user will ask questions such as “restaurants I like in Gothenburg where 2 of my friends have been and that are similar to that one restaurant in Sundsvall”. A search engine of today has NO possibility of replying with specific content meeting your demands for information. However, the data driven web will.
It will source information from all of your friends, all image websites, all review websites and all other such resources needed to understand what you “like”, which of your friends have been at a specific venue, and whether it uses the same furniture and menus as the other place you want it to resemble.
This is why I say that you need to align your content with other relevant content, and thus relate it to the networks of people, brands and other such domains that are necessary for you to become findable by someone who has never met you, yet has a lot of opinions about other things similar to you.
The Holistic approach to SEO
So, as you can see from the above, this was not the holistic thinking that is usually put out there. My main point with this post is that you should focus on the humans doing the searches and their needs rather than their preferred technology, and then understand that the data driven web will truly change the game, as users will ask for more than what you want or are able to give them.
A holistic approach to SEO doesn’t take aim at any one thing. Rather, it aims at being findable by relevant humans in everything. Regardless of technology or the current new black.
November 16, 2010
Voices within my network were raised earlier this week when Google announced that they have managed to crawl Flash websites further. In short, Google is proud to announce that they are able to crawl text within Flash better than before. Nothing wrong with that. Actually, there is nothing wrong with Flash at all. It is the misuse that is the problem with Flash. You know, Flash don’t kill websites… people kill websites…
Too few people have been taught how to properly include Flash files. People are also lazy and thus don’t care to learn more than one technology. This leads to an excessive use of Flash where CSS3 and HTML4 (not the over-hyped and still not accessible HTML5) do a better job. The benefits of jQuery shouldn’t be exaggerated either. jQuery is wonderful when used in a user friendly manner. A sketch of a markup-first inclusion follows below.
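For the record, “properly include” means markup-first: keep real, indexable HTML text in the page and only layer the SWF on top when the plugin is actually there (the approach SWFObject made popular). A sketch, with made-up element ids and file paths:

```typescript
// Progressive enhancement for Flash: the page ships as plain HTML, and the
// SWF replaces it only when the plugin is detected. Ids/paths are hypothetical.
const hasFlash = navigator.plugins.namedItem("Shockwave Flash") !== null;

const container = document.getElementById("hero");
if (container && hasFlash) {
  // Swap the indexable HTML fallback for the Flash movie.
  container.innerHTML =
    '<object type="application/x-shockwave-flash" data="/media/hero.swf" ' +
    'width="800" height="400"></object>';
}
// If Flash is missing (mobile devices, crawlers without plugin support),
// the original HTML text content simply stays in place.
```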
What ticks me off, however, are the inconsistencies in Google’s message here. I agree with Hessam’s comment (scroll a bit amongst the comments or Cmd+F hessam). The biggest problems with Flash aren’t solved just because Google indexes the websites. Flash still:
- Renders horribly, if at all, on mobile devices
- Loads slower than I take a shit
- Has a tendency to give too much freedom to designers, giving users a series of ADD experiences
However, Flash is pretty darn shiny if used correctly, with a fallback and with a Mobile, Print and web CSS. But now I am missing the point again. Let’s go back to the inconsistencies in Google’s communication. Have a look at the first 10 minutes of this 60-minute I/O talk from Google.
Don’t they repeatedly say: “There is no text on this website”?
Same goes for all Flash websites out there. Regardless of what’s in the SWF file, these websites have no text on them. And regardless of whether Google can crawl the website or not, it takes them a hell of a lot more time to crawl a heavy Flash website than a standard HTML website.
But HTML websites tend to look so pale… bah…
But what do the users like? Let’s have a look at the top 50 most popular websites online.
- Microsoft Live
- and it goes on…
Users don’t seem to flock to design-heavy websites, do they? What are the users looking for then… if they are not looking for good looks? They are looking for good content. Looks are secondary. Most designers get it wrong anyhow. They want the “picture” to look nice. Not to be functional. In Flash you can make the prettiest things. But they are a pretty image. Not a functional website.
If you design a good looking website based on the full image of the website rather than the parts, then you are really just drawing a painting. However, if you make each one of the parts look and work well, then you have the correct components to help the user USE your website.
Regardless of how good looking your website is, you need users actually wanting to use it. Not only look at it. Especially with the modern and social web. You NEED a website that is not only indexable in more than one way; you also need users to interact with your web content in order for it to be organically shared between users through their different networks.
Anyhow… I just needed to be a part of the discussion…
September 10, 2010
Those of you out there who put an equal sign between SEO and Google should really reconsider. Google is only one of many popular search engines out there, and if you aspire to be a good international SEO you have to take all of them into account when working out how to bring your clients or your company success.
Let’s start with a little video about what Google Instant is all about, so that we know we’re talking about the same thing here. It is really nothing more than an advanced Google Suggest with an annoying ajax search result page.
Google Instant – what it means
There are a lot of the same old buzzmakers trying to make a buck out of the next big thing. I don’t know when Steve Rubel started making his name as an SEO. I’ve always seen him as a PR consultant. He writes in his blog that Google Instant will kill SEO. Without even taking the other search engines into consideration, my take on the story is that it won’t even kill SEO in Google.
Google Instant actually improves search, and thus also optimization, bringing new content to the surface for searchers. It does decrease the importance of bulk links somewhat, as search will become increasingly long tail and the number of words used in search queries is likely to increase. This also demands that publishers grow their stock of relevant content.
If you as an SEO want to optimize for the web, only a few of these long tail searches will be relevant, as you will only put effort into what brings you cash back. The demand that you as an SEO understand analytics as well as conversion will increase, as you probably won’t be able to bulk traffic in a full vertical any longer. You will have to specialize and know what kind of searches bring you the dough. You will quite quickly see which search queries appear as referrals in your analytics, and can thus make the changes needed to rank for the keywords that convert – something like the sketch below.
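A sketch of that workflow: given query referral rows exported from your analytics tool (the row shape here is assumed), rank queries by conversion rate instead of raw visits.

```typescript
// "Know what searches bring you the dough": sort analytics referral rows
// by conversion rate, not traffic volume. Data below is illustrative.
interface QueryRow {
  query: string;
  visits: number;
  conversions: number;
}

function rankByConversion(rows: QueryRow[]): QueryRow[] {
  return [...rows]
    .filter((r) => r.visits > 0)
    .sort((a, b) => b.conversions / b.visits - a.conversions / a.visits);
}

const rows: QueryRow[] = [
  { query: "cheap red running shoes size 44", visits: 40, conversions: 6 },
  { query: "running shoes", visits: 900, conversions: 18 },
];
console.log(rankByConversion(rows)[0].query);
// -> the long-tail query converts at 15% vs 2%, so optimize for it first
```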
In many ways, this kind of search will become more predictable, not less predictable. It is obvious that Mr. Rubel doesn’t put much thought into the combination of analytics and on-page/off-page optimization. If you don’t focus on analytics when working with SEO, you are just about as relevant as a politician answering a question they don’t know the answer to.
Google Instant is an adaptation to the social web – and so should SEO’s
Many SEOs don’t like social media as anything other than a place to complain about how they don’t like social media. An SEO who wants to be successful in the long tail really should read up on some of the old spam tactics built on auto-generating content from the hidden web. Then translate those theories into what you can do when you have actual people producing content on your platform. That’s user generated content I’m talking about. That’s social media.
Users will always be able to generate long tail discussions you would never be able to come up with. A good SEO knows how to take advantage of this. Not only by using the content generated, but also by using the internal links created with user generated tags and themes created by whatever system you’re using. If you’re a WordPress user, then play around with SEO Smart Links and your category pages and you’ll see what I’m after.
Bad SEOs won’t stand a chance – that’s correct
Google Instant will make SEO more of a science, a more exact science. It will require thought, analysis and whatever good SEOs have been doing for many years. It won’t, however, kill SEO as a practice. It MIGHT disqualify some wannabes, but I am sure they will have a market as well, since it is quite easy to switch off Google Instant as your display mode.
Google is NOT equal to SEO
I should end this post with a reminder that Google is NOT equal to SEO. We still have YouTube, Twitter, Facebook, intranets, on-site search, catalogs and all the other very popular search engines to take into consideration. Those are equally important, as they can bring you extremely relevant traffic and improve the way you communicate with your website visitors.
SEO is about improving web communication. Get information searchers the information they need. Get buyers the stuff they need. AS FAST AS THEY WANT IT. Good SEO is not about getting a lot of traffic, but about getting the traffic that wants your stuff. Anything else is a waste of everyone’s time.
Now. Google Instant will be very interesting to study from a user behavior perspective. As the data will (hopefully) be more standardized, and as the number of different long search queries decreases, we will probably also have to adjust our tactics towards a web where we get less relevant traffic than we’re used to on some previously highly converting keywords. However, I am sure users will adapt and learn… or a competitor to Google will really push for a change back to what users like. Only time will tell.
August 5, 2010
[This blog post is a rewrite of a Swedish post by Magnus Bråth who published his findings on the Swedish SEO blog sokmotorkonsult.se. If you want to do the same with any of my blog posts into your own language, please ask first.]
It is not usual for me to do rewrites on this blog. However, since I get the question about static vs. dynamic URLs a lot, I thought this post might be interesting to rewrite for those of you who don’t speak Swedish. Magnus Bråth, a Swedish SEO with many years’ experience of building competitive performances within various sick verticals, made this tiny little test to see whether there was any truth to the assumption that Google doesn’t give any extra value to static or “user friendly” URLs. Google says they can read the dynamic ones equally well. As usual, a lot of people jumped to the conclusion that it is now just as good to use dynamic URLs. Some even said it would be better to use them. For some complex calendar or forum indexing this might actually hold. However, what Magnus found was that if you keep the keywords in your URL, you are better off than if you don’t.
Magnus’s test went down something like this:
- He wrote a blog post that he wanted to rank for the Swedish keyword “installera antivirus”, which translates to “install antivirus”.
- The URL contains the keyword “installera antivirus”
- Nothing else in the post includes the first of the two words, “installera”
- However, when you search for “installera antivirus”, the post ends up as number 1
Perhaps Magnus only wants to show the strength of his blog, but regardless of his reasons, this helps us draw the natural conclusion that using static and user friendly URLs does affect indexing (and possibly rankings as well). If it didn’t, his post simply wouldn’t rank, or even be indexed, since the keyword is placed solely in the URL.
This also means that those of you who talk about going to Google for the best SEO advice of the moment might consider thinking twice before talking too loudly. It just might be that they “just don’t tell the truth” all the time.
A simple but effective test that demonstrates something a lot of people have been thinking. Thank you, Magnus, for taking the time to share this test.
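If you want to act on Magnus’s finding, the practical takeaway is to keep the keyword in a static, readable slug instead of a dynamic ?p=123-style URL. A minimal sketch of a slug generator (the title below is made up):

```typescript
// Turn a post title into a static, keyword-bearing slug.
function slugify(title: string): string {
  return title
    .toLowerCase()
    .normalize("NFD")                // split "å" into "a" + diacritic mark
    .replace(/[\u0300-\u036f]/g, "") // drop the diacritic marks
    .replace(/[^a-z0-9]+/g, "-")     // non-alphanumerics become hyphens
    .replace(/^-+|-+$/g, "");        // trim stray hyphens
}

console.log(slugify("Installera antivirus – så gör du"));
// -> "installera-antivirus-sa-gor-du" (the keyword survives in the URL)
```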