Some people think that it is a bad thing. I would argue it is quite necessary. As you will see from today’s show, sometimes Google goes to places you just don’t want it going, and I don’t mean emotionally either. At the very least you should be directing Google where you want it to go. One of the easiest ways to do this is with your robots.txt file. This is a really simple thing to set up even if you are not that technical. The robots.txt file keeps Google out of areas you don’t want it nosing around in... in theory, anyway.
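To give you an idea of what that direction looks like, here is a minimal robots.txt sketch. The blocked paths are made up for illustration; substitute the areas of your own site you want kept out:

```
# Rules for all well-behaved robots
User-agent: *
Disallow: /admin/
Disallow: /old-site/

# Point crawlers at your sitemap while you're at it
Sitemap: https://www.example.com.au/sitemap.xml
```

The file lives at the root of your domain (yoursite.com.au/robots.txt), and as noted above it is only a request: good robots obey it, bad ones ignore it.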
Keep the Google Bot Out
Most good content management systems will come with a pre-populated robots.txt file that all good robots should obey. However, there are still plenty of robots that won’t. Google has released a new tool inside Webmaster Tools so you can test your robots.txt; see today’s video for details.
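If you’d rather test your rules programmatically than in Webmaster Tools, Python’s standard library ships a robots.txt parser. This is just a sketch with made-up rules, not Google’s tester:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules -- substitute your own file's contents.
rules = [
    "User-agent: *",
    "Disallow: /admin/",
    "Disallow: /old-site/",
]

rp = RobotFileParser()
rp.parse(rules)

# A well-behaved bot (Googlebot included) should skip the blocked areas
# and be free to fetch everything else.
print(rp.can_fetch("Googlebot", "https://example.com.au/admin/login"))    # False
print(rp.can_fetch("Googlebot", "https://example.com.au/products/hats"))  # True
```

In practice you would point the parser at your live file with `rp.set_url(...)` and `rp.read()`, but parsing the lines directly like this lets you test rule changes before you publish them.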
Google The Tour Guide
Think of Google as a nosy tour guide wanting to find out everything there is to know about your site so it can show its users. If you give it free rein, it will not only access all areas, it will index all areas for everyone else to find as well. Whilst it’s bad enough someone finding a version of your website from the 1990s, what’s worse still is Google trying to make sense of it and how it relates to your current site. Stretching the tour guide analogy even further, why would it want to send someone to a confusing destination when there are easier ones to navigate?
If you saw my short cryptic post yesterday about Enrico Altavilla, this is the follow-up to it. In today’s show I have been messing around with a technique that he recently wrote about. It’s not only a bit of fun but quite useful as well.
Index me “Any time”
As Enrico pointed out in his excellent post on his G+ profile (I’ve since gone looking for the link but can’t find it. Good one, Google), you can further refine the results you get when using the site: search feature. Warning: you may find this a little addictive if you are the slightest bit geeky.
When using a site: search, use the “sort by time” drop-down menu to see what Google has recently indexed on your site. Remember, as I point out in today’s video, indexing is not the same as crawling. Google may crawl thousands of pages on your site but index none.
Google indexing by time
Google will only index the page if it is new, and will re-index the page if the content has changed. You can also use this tool to see what sites Google is actively indexing in your space. If you are looking at a local audience, simply do a site:.au search with a key phrase to see what has been most recently indexed in your area of interest.
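To make that concrete, the searches described above look something like this. The domain and the key phrases are placeholders, not real examples:

```
site:example.com.au                  everything Google has indexed at the domain
site:example.com.au "blue widgets"   narrows to indexed pages mentioning the phrase
site:.au "emergency plumber"         recently indexed Australian pages for a local phrase
```

Run the query, then use the time drop-down (the one that defaults to “Any time”) to filter down to the past week or month and see what Google has picked up recently.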
One of our guys, Daniel Laidler, put me onto a curious post by Enrico Altavilla about using a combination of site: search and the “time” filter. If you are a geek like me, it is quite addictive. I’ve been messing with it for quite some time now. I thought I’d write this post to check out a few things. It is a test post for tomorrow’s video; check back then for more detail.
If you are a bricks-and-mortar business and rely on what some people used to call Google Maps, Google+ Local or Google Places, you may want to forget about putting any effort into a Google+ Page for your business. If local search is a driver of traffic to your business, then you are better off putting your time into the new Google My Business page rather than Google+… even though they look pretty much the same. Personally I find this very cumbersome and annoying, and I’m motivated! It must be very difficult for other business owners to get their heads around this.
1. Google My Business Is More Important For Local Search
On June 12 of this year Google launched Google My Business in Australia. This replaced the old Google Places and/or Google+ Local. It pretty much looks like a Google+ page for business... but it isn’t. To find yours, simply Google your brand. If you have previously set it up, you should be able to click on the link that says Google+... but of course it’s not G+, it’s Google My Business – simple, right?
Google My Business Listing
If you haven’t previously set up a Google Places or Google+ Local page, then you need to go here to set up a Google My Business listing. This is where all your updates and reviews will appear, along with general information about your business. There is also a new “introduction” area for you to fill out if you have not updated for a while.
If you are a business that discourages people from visiting your physical premises, then you may want to ignore a Google My Business local listing, as you have to verify with a physical address. If you can’t use a physical address, or you are setting up a Google presence for a brand, then you need a Google My Business Brand Page.
2. Google+ For Business Will Get Little Exposure In Search
If you have a Google+ Page for business, it’s really not worth pursuing it now unless you already have a large, active following on it. Google is pushing all activity for local businesses into the Google My Business pages, not Google+, so you need to move your activity into those pages. That’s what I’ll be doing.
I’m doing another review this week, but I don’t have access to Google Webmaster Tools for this website. Quite often we’ll get a client with the same issue: they’re scared of the power their web developer wields and are afraid to ask for access because they don’t want the confrontation. Here’s how you can review your own SEO without Webmaster Tools access.
1. Examine the Google Index.
This is pretty easy to do. Simply go to Google and type site:yourwebaddress.com.au (substitute your actual web address!). How many results do you get? If you have an ecommerce site, it should not be very different from the number of products you have, plus say 20%. I was talking to one client recently who had about 30,000 pages indexed. They thought this was OK. When I asked them how many products they had, they said 1,200. That is a massive discrepancy, even if you have 4 or 5 SKUs for different colours or sizes. You need to work out what the extra pages that have been crawled are. If your site uses www, do this search as well: site:yourwebaddress.com.au -inurl:www. This will return all the pages Google has indexed at your domain whose URLs do not include www. Quite often, as in today’s video, you will find subdomains that have been crawled that you really don’t want indexed.
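Side by side, the searches above look like this, plus one extra variant for checking a suspect subdomain directly (the domain and the “staging” subdomain are placeholders for illustration):

```
site:yourwebaddress.com.au              every indexed page at the domain
site:yourwebaddress.com.au -inurl:www   indexed pages whose URLs lack www
site:staging.yourwebaddress.com.au      check one suspect subdomain on its own
```

If the first query returns far more results than products-plus-20%, the second and third will usually point you at where the surplus pages are coming from.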
2. Crawl your own site.
Using a crawler like Screaming Frog (a free version is available), you can get a very good idea of things like broken links, bad redirections or empty page titles. Screaming Frog is the best SEO crawler I have found. If you use one that you love, let us know.
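If you’re curious what a crawler actually checks on each page, here is a rough single-page sketch in Python using only the standard library. It is nowhere near a full crawler like Screaming Frog (no fetching, no redirect or status-code checks), just the extract-the-title-and-links part, with made-up HTML:

```python
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    """Collects the <title> text and outbound link hrefs from one page's HTML."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.links = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def audit_page(html):
    """Return (title, links, problems) for one page. A real crawler does this
    for every page on the site and also follows each link to catch broken
    ones (4xx/5xx responses) and bad redirect chains."""
    parser = PageAudit()
    parser.feed(html)
    problems = []
    if not parser.title.strip():
        problems.append("empty page title")
    return parser.title.strip(), parser.links, problems

# Hypothetical page with a title and one internal link.
title, links, problems = audit_page(
    '<html><head><title>Red Hats</title></head>'
    '<body><a href="/products/hats">hats</a></body></html>'
)
print(title, links, problems)
```

A crawler then repeats this for every URL it discovers in `links`, which is how empty titles and broken links surface across a whole site.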
3. Check your backlinks.
There are a number of different backlink checkers out there, but my favourite is Majestic SEO. If you have ever had any SEO done in the past, chances are someone built you backlinks. These days they may be doing you more harm than good, and you really should get rid of any suspect ones. For a free backlink checker you could use something like www.opensiteexplorer.org, although I have not used it for quite some time. The reason I like Majestic is that it sorts by anchor text as well, which allows you to find the obvious bought, spammy backlinks quickly. Got some tools you’d like to share? I’d love to hear about them.