When you change your website's URL structure, it is very important to 301 redirect your old URLs to the new ones. Common cases include migrating between pages on a site or migrating between sites.
301 redirects during a migration have both SEO and usability impact, and if they are not done correctly, your site may lose valuable organic traffic and rankings.
Four reasons why you must 301 redirect your old URLs:
- Search engines (such as Google) have most likely crawled your site and indexed it in the SERPs. If a user then searches and finds your site organically, it would be a poor user experience if the link led to a 404 page.
- Search engines may recrawl your site via the old URLs, and if they stumble upon a 404 page, they will most likely drop you out of the SERPs if they can't see the association with the new URL. This again comes down to user experience, as search engines place high importance on ensuring users find what they're looking for.
- You will lose link juice from external sites, as the trust and authority they pass won't flow from the old URL to the new URL. Losing link juice means your site loses authority and trust: two important factors in SEO.
- If you did not update your internal links to point to the new URLs, you will also have a lot of broken links, which will negatively affect your internal PageRank flow.
You'd be surprised, but I have seen websites lose more than 50% of their organic traffic due to this oversight. Imagine if you ran a multi-million dollar e-commerce site: what would the implications of skipping this be?
Continue reading »
One of the key pieces of information provided here is the listing's organic ranking (the cd parameter). It can be found in the referral URL (or document.referrer when reading it from the DOM).
It's been more than 1.5 years since the announcement, so I figured the gradual roll-out would be almost complete (although I still see instances of the old referral URL being used). I therefore decided to implement a filter for Google Analytics that pulls in the organic ranking data and shows it in the keyword reports.
Before we get into it, there’s something important to know about the cd parameter. Traditionally in SEO, we’ve always known the SERPs to contain 10 organic listings (as shown below).
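To illustrate what we'll be extracting, here is a minimal sketch of pulling the cd parameter out of a Google referral URL. The referral URL below is a made-up example of the new format, shortened for readability.

```javascript
// Minimal sketch: extract the organic ranking (cd parameter)
// from a Google referral URL such as document.referrer would hold.
function getOrganicRank(referrer) {
  // cd is a plain query-string parameter, e.g. ...&cd=3&...
  const match = /[?&]cd=(\d+)/.exec(referrer);
  return match ? parseInt(match[1], 10) : null;
}

// Hypothetical referral URL for a click on the 3rd organic listing:
const ref = 'http://www.google.com/url?sa=t&source=web&cd=3&url=http%3A%2F%2Fexample.com%2F';
```

If the visit didn't come through the new referral format (or isn't from Google at all), the function simply returns null rather than a ranking.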
Continue reading »
I've taken some time out to write a script that provides a nice API for accessing Google cookies. If you've seen Google cookies before, you'll know they can look pretty cryptic, and working with them directly requires you to memorise how the cookies are structured, which isn't something you want to waste brain space on.
I won't go into the intricate details of Google cookies here, so this post assumes you know what you're looking for. I may write a more in-depth post on how Google cookies work later on. In the meantime, you can watch this presentation by Google on cookies (it's pretty good!) or read the documentation to find out more.
So how is this useful? Well, it really depends. You may use it to read GA campaign values and integrate them with your CRM system to track where your leads and sales are coming from, or to write custom scripts that integrate with GA (e.g. custom variables). It's really up to you!
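To give a flavour of what such an API has to deal with (this is not the full script, just a sketch), here's a minimal parser for the campaign portion of GA's __utmz cookie. The cookie value below is a made-up example for an organic Google visit.

```javascript
// Minimal sketch: parse campaign values out of a GA __utmz cookie value.
// The value's shape is roughly:
//   domainHash.timestamp.sessionCount.campaignCount.key=value|key=value|...
function parseUtmz(cookieValue) {
  // Everything after the 4th dot is the campaign data.
  const campaignPart = cookieValue.split('.').slice(4).join('.');
  const result = {};
  campaignPart.split('|').forEach(pair => {
    const [key, value] = pair.split('=');
    if (key) result[key] = value;
  });
  return result;
}

// Hypothetical __utmz value for an organic Google visit:
const utmz = '123456789.1300000000.1.1.utmcsr=google|utmccn=(organic)|utmcmd=organic|utmctr=seo%20tips';
```

Note how the keyword (utmctr) comes back URL-encoded; you'd run it through decodeURIComponent before handing it to a CRM.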
Anyway, on to the script.
Continue reading »
It is a known problem that search engine crawlers can't read images, and hence can't determine what an image is about. However, this can be overcome by using the alt attribute of the img tag to describe the image, so that search engines can read what it is about.
On the other hand, I believe that in certain situations it may be better to optimise your on-page content using actual text rather than the image alt attribute. You can do this by using CSS to replace text, whether inside anchors, header tags, or simply text in general, with background images. Obviously you don't want to overdo this (i.e. apply it to every image on the page), lest you trigger Google's spam alert; it's also very time consuming!
So is using CSS to optimise your on-page content illegal in the eyes of Google? Will you be considered to be obfuscating the search engines for SEO purposes, and hence get yourself banned from the SERPs? The answer depends on how you do it, and the question to ask yourself is: are you trying to be dodgy?
The Way To Get Yourself Banned
In Google Webmaster help, under hidden text and links, the criteria for getting yourself banned are pretty clear. Although it's not in the list, I would avoid using text-indent: -9999px to hide your text and use display: none instead.
I would say the reason for this is that the text-indent property, according to the W3C, is meant for text formatting, not visual formatting. The display property, on the other hand, is used for visual formatting, which fits the purpose of 'replacing' text with images, as that is a visual concern.
Google does not algorithmically ban websites from the SERPs for using CSS to hide things; a ban would come from some sort of manual review. That's why it's important to ensure you don't leave comments in the source code that reveal an intention of keyword spamming or of displaying optimised keywords only to the search engines.
The simple rule to follow is this: if you find yourself questioning whether what you're doing is spammy, then it probably is. Make sure you're using CSS image replacement with the right intention, which is to provide accessibility to users who have CSS disabled and to ensure that search engines can read and recognise the important aspects of your page that add value to the user experience.
Using CSS Image Replacement The Right Way
Let's take a look at the homepage of Allianz, a major insurance provider, and how they've used CSS replacement the correct way.
You can see on the homepage that the navigation menu (highlighted in the red box) consists of image menu items that search engines aren't able to read. Well, they can actually read them if the image files are optimised with the alt attribute, but I believe optimised anchor text carries greater weight in SEO than image alt attributes. This use of images is obviously more aesthetically appealing to the user than plain anchor text.
Looking at the non-CSS version when you disable CSS, you can see below (highlighted in the red box) that these navigation menus are actually anchor text. From a text perspective, this is essentially what search engines see when they crawl the site.
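The technique can be sketched roughly like this (the class name, markup, and image path are made up for illustration, not taken from Allianz's actual code). The real anchor text stays in the HTML for crawlers and CSS-disabled users, while sighted users see the background image:

```css
/* Markup: <a class="nav-home" href="/"><span>Home Insurance</span></a> */
.nav-home {
  display: block;
  width: 120px;
  height: 40px;
  background: url('/images/nav-home.png') no-repeat;
}
/* Hide the text visually but keep it in the document for
   search engines and CSS-disabled browsers. */
.nav-home span {
  display: none;
}
```

With CSS disabled, the rules above simply never apply, so the "Home Insurance" anchor text renders as ordinary text, exactly what the Allianz screenshot shows.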
When launching a website, it is important to identify where your target audience is located and to ensure that you get found within that region. For example, if you ran an e-commerce store in Australia that only shipped goods locally and not internationally, your target customers would be Australians, which is why it would be important to localise your website for the local search engines.
The reason for this is that whenever a user types Google.com or Yahoo.com into their browser, these search engines redirect them to the localised version based on IP detection. This means that if you're browsing from Australia and you type in Google.com, you will be redirected to Google.com.au (and for Yahoo.com, to au.Yahoo.com). Once you're at these localised search engines, you have the option of choosing local-only search results.
This means that if you are found in these localised search results, the organic traffic you get will be highly qualified. This is a win-win for both search engines and website owners: search engines serve more relevant, better quality results, and website owners receive relevant, localised traffic. It's all about relevancy!
Now that you know the importance of localisation (or geo-targeting), here are some common factors that search engines look for, as well as factors specific to the major search engines (Google, Bing, Yahoo).