Technical SEO

It’s a common problem. You’ve spent a lot of time and money getting your website to look just the way you want it to, but you’re not appearing on the first page of the search results. Sure, your AdWords campaign is bringing in an audience, but it’s costing you a lot of money. Why aren’t you getting any of that invaluable free organic traffic? Here are 7 simple ways to check whether you’ve got the basics set up correctly to maximize your organic search performance.

1. Title Tags – Do you have unique title tags for every page? A title tag is the blue header text displayed in the Google search results and your very first chance to grab attention and win that click. A good title tag makes direct reference to what is on that particular page and becomes more granular the deeper you go into your site (see the example after the checklist below). There is an easy way to check whether you have unique title tags on each page:

  • Load 3-5 different pages on your website
  • Right click on each page and select ‘View Page Source’ (or Control-U as a shortcut)
  • Use find (Control-F) to search for ‘<title>’ and you’ll be able to see the title tag. It will appear near the very top of the page, and no two pages should have the same title tag
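
To make this concrete, here’s roughly what good, page-specific title tags look like in the source – the business and page names are invented purely for illustration:

    <!-- Home page -->
    <title>Smith’s Shoes | Quality Footwear in Melbourne</title>

    <!-- A deeper category page – more granular, but still unique -->
    <title>Men’s Leather Boots | Smith’s Shoes</title>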

Why is this important? Title tags allow Google to quickly identify what is on a page. If every page on your site has the same title tag, it is much harder for Google to judge the relevance of each page and know when to return it in the search results.

2. Meta Description – Similar to the title tag, the meta description provides search engine users with information on what they can expect to find by visiting your page. It’s important that each page on your website features a unique meta description that is relevant to that individual page; you can go through the same process you used for title tags to check. A meta description that gives a general overview of your company may be fine for the home page, but displaying that same generic text in the search results when someone is after a specific product will lower your click-through rate. Getting your meta descriptions right is not as mission-critical as getting your title tags correct, but a bad meta description can still hurt your chance of getting the click.
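
As a sketch, the meta description sits in the page’s <head> alongside the title tag – the copy here is invented for illustration:

    <head>
      <title>Men’s Leather Boots | Smith’s Shoes</title>
      <meta name="description" content="Browse our range of men’s leather boots. Free shipping on orders over $50 and easy 30-day returns.">
    </head>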

3. XML Sitemap – I can’t stress enough just how important it is to have an XML Sitemap. To index the web, Google sends out website crawlers that spread out over the internet, capture information, and record it in their (very big!) database. When someone conducts a search, Google queries that database to find the most relevant results to populate the search results. The XML Sitemap makes it as simple as possible for Google to catalogue all the information on all your pages. It doesn’t directly affect your rankings, but it does make sure that Google knows your content exists – and after all, if your content is invisible to Google then it won’t be featured in the search results. How do you find out if you have an XML Sitemap? Simply put your website URL into this free tool – http://seositecheckup.com/ – and check the section named “Sitemap Test” (note you can only run one scan every 2 hours without registering).
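
If you’re curious what the file itself looks like, a minimal XML Sitemap (usually served at yourdomain.com/sitemap.xml – the URLs below are placeholders) is simply a list of your pages:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2015-06-01</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/mens-leather-boots</loc>
      </url>
    </urlset>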

4. Robots.txt – This little text file (which implements the Robots Exclusion Protocol) can be very useful, but if implemented incorrectly it can also be very dangerous. Before Google crawls your site, it checks this file to see which pages it should not crawl and index, amongst other things. These can be pages such as admin logins, private pages and other pages that you don’t need indexed, and excluding them can boost your website efficiency by reducing index bloat. However, if your robots.txt file is accidentally telling Google to “Disallow: /”, then effectively you’re telling Google it’s not allowed to visit your website at all! So be careful! How do you check if you have a robots.txt file? Simply input your URL here – http://seositecheckup.com/tools/robotstxt-test (note you can only run this every 30 minutes without signing up). You may want to ask someone with robots.txt experience to check the file to make sure it isn’t inadvertently blocking content that you do want to have indexed.
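
To make the difference concrete, here’s a sketch of a sensible robots.txt alongside the dangerous one – the paths are examples only:

    # Safe: blocks only the admin area and points crawlers to the sitemap
    User-agent: *
    Disallow: /admin/
    Sitemap: http://www.example.com/sitemap.xml

    # Dangerous: this single rule blocks your ENTIRE site
    User-agent: *
    Disallow: /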

5. Rel=canonical – The rel=canonical link element helps search engines deal with duplicate content. It tells them which version of a duplicate page is your preferred one, which helps them decide which URL to display in the search results and concentrates your ranking signals onto a single page. As a result, it can improve your performance in the organic search results. Duplicate versions of a page can be created inadvertently by a number of factors, including the categorized structure of ecommerce websites, the use of tracking parameters in a URL, and URL parameters that affect the way content is displayed but not the content itself (e.g. sorting and filtering parameters). As such, every page on your website should include the rel=canonical link element, indicating to search engines your preferred version of the URL. How to check for rel=canonical? Simply view the page source (Control-U) and use find (Control-F) to locate rel=canonical.
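
For example, if the same product page can be reached with a sorting parameter, both versions would carry the same canonical tag in their <head> – the URLs are invented for illustration:

    <!-- Served on both http://www.example.com/mens-leather-boots
         and http://www.example.com/mens-leather-boots?sort=price -->
    <link rel="canonical" href="http://www.example.com/mens-leather-boots">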

6. URL Structure – Ideally, the URL of a page should be relevant to the content on that page. This not only helps your visitors understand what the page is about, but is also used by search engines as a ranking factor in their algorithm. These are the key factors to check to see if your site is structured correctly (a before-and-after example follows the list):

  • Use keywords in the URL where possible and keep your URL close to your title tag
  • Make your URLs easily readable so a person (and Google) can tell what is on that page
  • Use hyphens (-) instead of underscores (_) to separate words where possible
  • Exclude dynamic parameters where possible and instead use static URLs
  • Try to keep everything on a single domain with at most one subdomain
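
Putting those factors together, here’s a hypothetical before-and-after:

    Hard to read:  http://www.example.com/index.php?cat=21&prod_id=1053&sort=asc
    Much better:   http://www.example.com/mens-boots/leather-chelsea-boots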

7. Google Place Listing – If you are a business with a physical address, you need to claim your local listing. Doing so ensures accurate information is displayed to people searching for a business like yours in their area. This is invaluable in driving local customers to your business and gives you the best chance of being displayed at the very top of the search results.

These 7 factors can have a significant impact on the way your website displays in the organic search results. Without them properly taken care of, you’re missing the foundations of your SEO, so anything else you do from an SEO perspective may struggle to generate results. It’s vital that your website is effectively optimized so that it is easy for search engines to locate, index, and rank your content for the relevant search terms. Google didn’t get to be the best by guessing which sites to show; their algorithm is finely tuned towards returning the best website for any given search query.

Make Google’s job easy and get the basics right. The rewards can be tremendous.