


I have had reason to investigate speeding fine laws recently, and I noticed a few people trying to sell information products in the space. It turns out they were targeting the wrong people. In today’s video I use two free tools that I use all the time for SEO.

1. Google Trends

I use Google Trends all the time for keyword research. I’ve used a lot of different tools over the years, but for ease of use and accuracy Trends wins hands down. In today’s show I was looking at the phrase “speeding fines”, and when I looked at related phrases I saw that “demerit points” was a breakout phrase. When you see this in Google Trends it means searches for the phrase have risen sharply over a short period. Quite often you’ll see it for topics that have been in the news a lot.

Demerit points breakout

I decided to compare the volume of “demerit points” searches with that of “speeding fines”. As you will see in the video, I was pretty amazed. Since I recorded, I decided to add a control phrase to the analysis, one whose real volume I have some idea of from my own Webmaster Tools data. The easiest to use was my own name: it has decent volume, as I am not the only Jim Stewart around, but I rank well for it. In April, Webmaster Tools told me there were 1,300 searches for my name, which was about 2% of the volume of “demerit points”.

 

Demerit points volume

Based on the above data I can calculate that there were about 65,000 searches in Australia for “demerit points” in April versus 17,550 for “speeding fines”. For the site I’m looking at in today’s video, converting that extra traffic at 5% would mean a difference of around 2,300 sales at $67, or $150k plus. That is not chump change. Guess what, though? They don’t rank for “demerit points”. Which brings me to my next tool, for quickly analysing your competitors.
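As a quick sanity check, the back-of-envelope maths above can be sketched in a few lines of Python. All the inputs are the estimates from this post (volumes, an assumed 5% conversion rate and a $67 price), not measured figures:

```python
# Rough estimate of the opportunity missed by targeting the wrong phrase.
# All inputs are estimates from the post, not measured data.
demerit_volume = 65_000    # est. AU searches for "demerit points" in April
speeding_volume = 17_550   # est. AU searches for "speeding fines" in April
conversion_rate = 0.05     # assumed 5% conversion to sale
price = 67                 # product price in dollars

extra_searches = demerit_volume - speeding_volume   # 47,450 extra visitors
extra_sales = extra_searches * conversion_rate      # roughly 2,370 sales
extra_revenue = extra_sales * price                 # roughly $159k

print(f"Extra sales: {extra_sales:,.0f}, extra revenue: ${extra_revenue:,.0f}")
```

That is where the “$150k plus” figure comes from.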

2. Aaron Wall’s SEO For Firefox.

I love this tool and have been using it for years. It puts a lot of data within easy reach when you need it. It’s like a toolbox for the SERPs: you can quickly see exactly how strong your competition is for any phrase. In today’s video I quickly work out that it’s actually easier to rank for “demerit points” than “speeding fines”, and that’s where all the money is.

 


You Need To Manipulate Google

by Jim on July 23, 2014


Some people think that manipulating Google is a bad thing. I would argue it is quite necessary. As you will see from today’s show, sometimes Google goes to places you just don’t want it going, and I don’t mean emotionally either. At the very least you should be directing Google where you want it to go. One of the easiest ways to do this is to use your robots.txt file. This is a really simple thing to set up even if you are not that technical. The robots.txt file keeps Google out of areas you don’t want it nosing around in... in theory, anyway.

Keep the Google Bot Out

Most good content management systems come with a pre-populated robots.txt file that all well-behaved robots should obey. However, there are still plenty of robots that won’t. Google has released a new tool inside Webmaster Tools so you can test your robots.txt; see today’s video for details.
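For illustration, here is what a minimal robots.txt might look like. The paths are hypothetical examples (a WordPress-style admin area and an internal search directory), not a recommendation for any particular site:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /search/

Sitemap: https://www.example.com/sitemap.xml
```

Remember, this only keeps out robots that choose to obey it; badly behaved bots will ignore it completely.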


Google The Tour Guide

Think of Google as a nosy tour guide that wants to find out everything there is to know about your site so it can show its users. If you give it free rein, it will not only access all areas, it will index all areas for everyone else to find as well. While it’s bad enough having someone find a version of your website from the 1990s, what’s worse is Google trying to make sense of it and how it relates to your current site. Stretching the tour guide analogy even further: why would a guide send someone to a confusing destination when there are easier ones to navigate?



If you saw my short, cryptic post yesterday about Enrico Altavilla, this is the follow-up to it. In today’s show I have been messing around with a technique that he recently wrote about. It’s not only a bit of fun but quite useful as well.

Index me “Any time”

As Enrico pointed out in his excellent post on his G+ profile (I’ve since gone looking for the link but can’t find it. Good one, Google.), you can further refine the results you get when using the site: search feature. Warning: you may find this a little addictive if you are the slightest bit geeky.

When using a site: search, use the sort-by-time drop-down menu to see what Google has recently indexed on your site. Remember, as I point out in today’s video, indexing is not the same as crawling: Google may crawl thousands of pages on your site but index none of them.
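To make that concrete, here are a couple of example queries (yoursite.com is a placeholder for your own domain):

```
site:yoursite.com                     everything Google has indexed on your domain
site:yoursite.com "demerit points"    indexed pages on your domain matching a phrase
```

Then open Search tools, change “Any time” to “Past 24 hours” (or week, or month) and sort by date to see the freshest indexed pages first. Menu labels may vary as Google updates the interface.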

Google indexing by time

Google will only index a page if it is new, and will only re-index it if the content has changed. You can also use this tool to see which sites Google is actively indexing in your space. If you are targeting a local audience, simply do a site:au search with a key phrase to see what has most recently been indexed in your area of interest.

 


Testing the Google Index

by Jim on July 15, 2014

One of our guys, Daniel Laidler, put me onto a curious post by Enrico Altavilla about using a combination of site: search and the “time” filter. If you are a geek like me, it is quite addictive; I’ve been messing with it for quite some time now. I thought I’d write this post to check out a few things. This post is a test post for tomorrow’s video post. Check back then for more detail.


If your site has duplicate content — if you use the same wording in more than one place or copy text from elsewhere on the web — then you’re setting yourself up for trouble with Google. Because duplicate content is associated with spam, low quality sites and black hat SEO, it lowers search engine rankings.

You should be aware that duplicate product descriptions have the same negative effect as duplicate blog entries or web page content. It’s not the biggest factor in rankings; on a well-built site, content strategy is the largest influence on Google rankings. However, in our experience, removing duplicate content can increase online sales by 10% or more.

SEO Effects of Duplicate Content

The graph above illustrates the benefits of removing duplicate content from your site.

Google will detect duplicate content if two URLs point to the same page, and that’s something that we’ll correct in our initial site overhaul. However, duplicate product descriptions remain a common problem, too. They make it harder to rank for long tailed keywords, and using all-original content can improve conversion rates, as well.
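As a side note, the standard way to tell Google which of two URLs serving the same page is the preferred one (not necessarily the exact fix we apply in every overhaul) is a canonical tag in the page’s head. The URLs below are placeholders:

```html
<!-- On https://www.example.com/product?sort=price, which shows the same
     content as the clean URL, point Google at the preferred version: -->
<link rel="canonical" href="https://www.example.com/product" />
```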

When do we see duplicate product descriptions?

Duplicate product descriptions are fairly common, and they’re an easy mistake to make. We most often see them associated with similar products on the same e-commerce site or copied from the manufacturer’s site by resellers. It’s only natural to find the right wording and then use it wherever it applies, right? Well, unfortunately, that doesn’t fly with Google’s ranking algorithm. If you have landing or sales pages for slight product variations such as colour and size, and the pages use some of the same wording, then it can cause a big problem with your SEO.

A case study

We have a StewArt client who was guilty of both mistakes: they were using product descriptions from the manufacturer’s site, and they were using separate pages for different sizes, too. They had site-wide internal and external duplication. To Google’s search algorithm, this made their site look low quality and derivative.

What did we do? We:

  1. Switched them over to drop-down menus
  2. Had custom content written for the product descriptions
  3. Set up Google Publisher

Drop-down Menus

Almost every e-commerce site sells variations on products, whether it’s different shoe colours, different dress sizes or features that can be added on. This client was no exception, and they had placed each variation on its own sales page. The first thing we did was restructure the e-commerce set-up so that it used drop-down menus rather than separate pages for minor variations on the products. Fewer sales pages means less work for you, a simpler experience for your customer, and an easier job for Google’s spiders.
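To sketch the idea with hypothetical markup: instead of a separate URL for each colour, one product page offers the variation in a drop-down, so Google only ever sees a single sales page:

```html
<!-- One sales page, one URL; the colour variation lives in a drop-down -->
<label for="colour">Colour</label>
<select id="colour" name="colour">
  <option value="red">Red</option>
  <option value="blue">Blue</option>
  <option value="green">Green</option>
</select>
```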

Original Content

Once we had consolidated the products, we asked our client to rewrite the product descriptions, aiming the copy directly at their market and their customer base. That’s good marketing as well as good SEO. The original content can be written by you (our client), or contracted out to a content writer or copywriting agency.

Google Publisher

Finally, we set up Google Publisher. Adding a publisher markup tag (rel="publisher") to a website tells Google that the content published there is written by the same publisher who manages the Google+ business page. This also helps to protect a site’s ranking: if someone plagiarises your unique content, your site should still rank higher thanks to the publisher set-up.
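The markup itself is a single link tag in the site’s head; the Google+ URL below is a placeholder for your own business page:

```html
<link rel="publisher" href="https://plus.google.com/+YourBusinessPage/" />
```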

Results

In the screen shot below, you can see that the client improved their commerce conversion by 8.9%. Similarly, transactions were up by 12.5%. This was entirely due to the steps described above; the only work we did for this client during the time period on the chart was eliminating duplicate content.

Long tailed keywords are keywords that are highly specific, and they are linked to much higher than average conversion rates. Previously, our client’s rankings for long tailed keywords had dropped off. After fixing the duplicate content issue, their ranking for long tailed keywords increased dramatically, thus conversion and sales increased as well. 

So, to summarise:

Strategies:

  1. We replaced separate pages for variations on items with consolidated pages and drop-down menus.
  2. We replaced copied, repetitive content with custom-written, unique product descriptions.
  3. We engaged Google Publisher throughout the site.

Results:

  • ✓ Greatly improved rankings for long tailed keywords
  • ✓ Commerce conversion rate +8.9%
  • ✓ Transactions +12.5%


2 Reasons To Forget About Google+ For Business

by Jim July 9, 2014

If you are a bricks-and-mortar business and rely on what some people used to call Google Maps, Google+ Local or Google Places, you may want to forget about putting any effort into a Google+ Page for your business. If local search is a driver of traffic to your business then you are better [...]

Read more: 2 Reasons To Forget About Google+ For Business

The Art and Science of Landing Page content

by Daniel Laidler July 4, 2014

As an Account Manager I often have conversations about writing a landing page for a particular key phrase (or grouping of key phrases). My approach has been refined over the years, the analogies have improved, and I would like to write it down for you today, if for no other reason than I can refer to [...]

Read more: The Art and Science of Landing Page content

No webmaster tools? Use these 3 instead.

by Jim July 2, 2014

I’m doing another review this week, but I don’t have access to Google Webmaster Tools for this website. Quite often we’ll get a client who has the same issue. They’re scared of the power their web developer wields and are afraid to ask them for access because they don’t want the confrontation. [...]

Read more: No webmaster tools? Use these 3 instead.

3 Simple SEO Fixes

by Jim June 25, 2014

This week I take a look at a mate’s site: Sean Callanan’s SportsGeek. There are some common SEO issues in Sean’s site that I see on a lot of sites, and in our experience, when you fix these things you get a good bump in rankings. 1. Check your sitemap entry [...]

Read more: 3 Simple SEO Fixes

3 Steps To Recover Quickly From Panda

by Jim June 18, 2014

In case you missed it, a couple of weeks ago Google released another major Panda update. If you’re new here, Panda is the code name given to an algorithm update that focuses on “quality” issues. I use the term quality loosely, as it is what Google decides is quality, not what you or I [...]

Read more: 3 Steps To Recover Quickly From Panda