Tuesday, August 30, 2016

On Page SEO Checklist for the Optimized Page

This article is part of an SEO series from WooRank. Thank you for supporting the partners who make SitePoint possible.

The content on your page, and the technical elements behind the scenes, are the most accessible and controllable SEO factors for your site. They’re the best place to start when you’re looking to improve your rankings or redesign your site. Plus, putting together a page with well-optimized on page elements will help you with your off page efforts.

Since your goal is to make your site as search engine friendly as possible, we’ve got a checklist of how to optimize each on page element, broken up into technical and content optimizations.

Technical On Page Elements

URLs

Optimize your URLs for both human and search engine usability — both affect your SEO. Make sure they clearly show where the page stands in the site’s hierarchy of information, so users always know where they are, and include the page’s most important keyword. Take WooRank’s SEO guide on link juice for example: http://ift.tt/290glxH. Users can look at that URL and know that:

  1. It’s about link juice.
  2. It’s part of the SEO Guide category, which is part of the Educational category.
  3. The page is in English, and the fact that this is specified is a pretty strong hint that at least one other language option is available.

A more generic URL, like this one for a Banana Republic product page, http://ift.tt/2byyl98, is a lot less useful for users, who can’t see what product or category it’s for.

URL structure is also important to search engines because it helps them understand the importance of a page in relation to the rest of the website. Including keywords in the URL tells them how relevant the page is to a certain topic or keyword. Having well-optimized URLs also helps attract more relevant backlinks since people are more likely to use relevant anchor text for URLs that include keywords.
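
As a rough illustration (the domain and paths below are placeholders, not real pages), compare a keyword-rich, hierarchical URL with a generic parameter-based URL for the same product page:

https://example.com/shoes/running/trail-running-shoes
https://example.com/product?id=48213&cat=7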

Title Tag

Title tags are one of the most important parts of on page SEO. Search engines see them as perhaps the strongest hint about a page’s topic. Put your most important keywords in title tags so search engines can see how relevant a page is to a search. It’s important to write unique title tags for each page since, in theory, all of your pages are different (otherwise you’d have duplicate content).

A well-optimized title tag is no more than 60 characters (with between 50 and 60 characters as the ideal) and uses keywords at the beginning of the tag. If you want to include more than one keyword, or put your brand in the title as well, separate them with pipes (the | character). A well-optimized title for this article might look like:

<title>On Page SEO Checklist | SEO Best Practices | SitePoint</title>

Be careful when writing your titles, though, because search engines have gotten really good at figuring out when you’re trying to manipulate them. Stuffing your title tag with keywords will end up doing more harm than good: this page crammed its title tag with keywords related to deals on the iPhone 5 and could only get to the 54th page on Google.

Over-optimized title tag

If you’re optimizing for local search, include not only your keyword but also your target location, business and industry.
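
As a minimal sketch of what that looks like in practice — the business name, city and keyword below are hypothetical — a local-search title tag might read:

<!-- hypothetical local business: keyword + location + brand -->
<title>Emergency Plumber in Austin, TX | Smith Plumbing</title>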

Meta Description

Search engines sometimes use meta descriptions to help determine the topic of a page, but don’t really use them as a ranking factor. However, they do use them, along with title tags, to form a page’s search snippet. Search snippets are the titles, links and descriptions displayed by search engines in SERPs.

Think of search snippets and meta descriptions as free text ads for your website — use them to entice users to click through to your page. Keywords used in your description will appear in bold when they match keywords in user search queries, so use them when appropriate, and, if possible, include prices and commercial words like cheap, free shipping, deals and reviews to attract in-market users.

Having a high click through rate (CTR) looks good to search engines and will help increase your rank. However, having a high bounce rate will look bad, so clearly and accurately describe the information on your page.
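
To show how those recommendations fit together, here is a sketch of a meta description for a hypothetical e-commerce category page (the copy and page are made up):

<!-- hypothetical description using commercial words and a clear, accurate summary -->
<meta name="description" content="Free shipping on all running shoes. Compare deals, read reviews and save up to 40% on this season's top brands.">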

Robots.txt

A robots.txt file is a simple text file, stored in your website’s root directory, that specifies which pages can and cannot be crawled by search engine bots. It’s generally used to keep search engines from crawling and indexing pages you wouldn’t necessarily want to show up in search results, such as private folders, temporary folders or your legacy site after a migration. You can block all, none or individual bots to save on bandwidth.

There are two parts to a robots.txt file: User-agent and Disallow. User-agent refers to the crawler you want to specify. It most often uses a wildcard, represented with an asterisk (*), to refer to all bots, but it can also name individual crawlers. Disallow specifies which pages, directories or files the named user-agent cannot access. A blank Disallow line means the bots can access the whole site, while a single slash (/) blocks the whole server.

So to block all bots from the whole server, your robots.txt should look like this:

User-agent: *
Disallow: /

To allow all crawlers access to the whole site use:

User-agent: *
Disallow:

Blocking a bot (in this example, Googlebot) from accessing certain folders, files and file types would look like this:

User-agent: Googlebot
Disallow: /tmp/
Disallow: /folder/file.html
Disallow: /*.ppt$

Some crawlers even let you get granular by using the Allow directive, so you can allow access to individual files stored in a disallowed folder:

User-agent: *
Disallow: /folder/
Allow: /folder/file.html

Note that a robots.txt file won’t keep a page out of search results if a bot finds your URL linked from outside your site; the page can still end up indexed. If you want extra blocking power, use a robots meta tag with a "noindex" value:

<meta name="robots” content=”noindex”>

Unfortunately, not all bots are legitimate, so some spammers may still ignore your robots.txt and robots meta tag. The only way to stop those bots is by blocking IP access through your server configuration or network firewall.

Sitemap

Sitemaps are XML files that list every URL on a website and provide a few key details, such as date of last modification, change frequency and priority. If you’ve got a multilingual site you can also list your hreflang links in your sitemap. A (very simple) sitemap for a site with one page looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://ift.tt/xwbjRF" xmlns:xhtml="http://ift.tt/2bTDKEE">
    <url>
        <loc>http://ift.tt/2byyU2M</loc>
        <lastmod>2016-08-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.9</priority>
        <xhtml:link rel="alternate" hreflang="fr" href="http://ift.tt/2bTCP7v" />
    </url>
</urlset>

<urlset> opens and closes the sitemap and references the current sitemap protocol. <loc> is the address of the web page and is required. Always use uniform URLs in your sitemap (https, www resolve, etc.). <changefreq> tells search engines how often you update your page, which can encourage them to crawl your site more often. Don’t lie, though, since they can tell when the <changefreq> doesn’t match up with actual changes, which may cause them to just ignore this parameter. <priority> tells search engines how important this URL is compared to others, which helps bots crawl your site more intelligently, giving preference to higher-priority pages. <xhtml:link> tags give URLs for alternate versions of the page.

The <loc> element is required, since it lists where to find a page, while the other values are optional. Learn more about URL attributes and other types of sitemaps in our sitemaps beginners guide.
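
Once your sitemap is published, you can also point crawlers to it from your robots.txt file with a Sitemap directive; the URL below is a placeholder for your own sitemap’s location:

# placeholder - replace with your sitemap's actual URL
Sitemap: https://example.com/sitemap.xml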

Canonical URLs

It’s easy to wind up inadvertently hosting duplicate content due to your content management system, syndicated content or e-commerce shopping systems. In these cases, use the rel="canonical" tag to point search engines to the original content. When search engines see this annotation, they know the current page is a copy and to pass link juice, trust and authority to the original page.
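
A minimal sketch of what that looks like in the <head> of the duplicate page, with a placeholder URL standing in for the original content:

<!-- placeholder URL: point this at the original version of the content -->
<link rel="canonical" href="https://example.com/original-page/">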



by Greg Snow-Wasserman via SitePoint
