Thursday, September 22, 2016

5 SEO Guidelines for Web Developers

This article is part of an SEO series from WooRank. Thank you for supporting the partners who make SitePoint possible.

If you’ve decided that you are going to rely on organic search engine results (as opposed to paid search traffic or display advertising) as the primary driver of traffic to your website, you need to take that into account when coding pages. SEO is about much more than keywords, synonyms and content marketing — there are a lot of technical aspects going on behind the scenes that help determine where a page ranks in search results.

The first step is to make sure your page is accessible to search engines, and that their robots can see the page content. In Google Search Console, use Fetch as Google in the Crawl section to see how your page appears to search engines. Remember, crawlers can’t access iframes and are limited when indexing content in Flash or Silverlight, so if you’ve got important content, keep it in HTML.

Fetch as Google

Once your site basics can be seen, crawled and indexed by search engines, use these guidelines to make sure robots can properly figure out what your pages are about, how they relate to keywords and what sort of user experience they will provide.

Write URLs for SEO

Clean URLs

A page’s URL is an integral part of its user experience and SEO. In fact, it’s the first thing search engine crawlers see and, ideally, it tells them a lot about the page and its content. That means your URLs need to be clean, easy to read, descriptive and free of URL parameters. Take two URLs for example:

http://www.example.com/index.php?controller=category&id_category=4&id_product=172

http://www.example.com/running-gear/trail-shoes

The first URL has unnecessary parameters that are likely to confuse both robots and people, since neither can tell what category or product the page is for. People are much less likely to share or click on such a URL, and search engines will have trouble determining its relevance to a keyword. The second URL is far preferable: it’s easier to read, tells you what category and product you’ll find on the page, and doesn’t contain any confusing parameters or query strings.

You often wind up with URL parameters due to analytics and tracking programs, and when your CMS serves dynamic page elements like filters and sorting. If you’re using a CMS such as WordPress, you can rewrite your URLs by changing the permalink settings (Settings > Permalinks in the admin menu).
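For instance, WordPress’s permalink screen accepts a custom structure built from structure tags. One common SEO-friendly pattern (a reasonable choice, not the only one) is:

```
/%category%/%postname%/
```

This produces descriptive URLs built from the post’s category and slug instead of query-string URLs like /?p=123.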

Permalink feature in WordPress

Optimized URLs

The structure and words you use in your URLs are also very important for SEO. The URL’s path helps search engines understand the page’s relationship and importance to the rest of the site. The words used in the URL tells them how relevant that page is to a particular topic or keyword. A well-optimized URL structure has the following elements:

  • Short and descriptive: Ideally, the keywords in your URL will describe the page content. If, for whatever reason, you don’t use keywords in your URLs, keep the path as short as possible: use as few words as you can and avoid stop words (the, a, of, on, etc.) altogether.
  • Hyphens instead of underscores: When separating words in URLs, always use hyphens. Search engines treat hyphens as word separators, so they can recognize urls-written-like-this. Underscores, on the other hand, don’t denote anything to them, so they read urls_written_like_this as urlswrittenlikethis and will struggle to interpret the URL and recognize its keywords.
  • Keywords used at the beginning: Put your most important keywords at the beginning of the URL. Search engine crawlers assign more value to these words. This is another reason to keep your URLs short: The fewer words in the URL, the more value search engines place on each one. However, it’s absolutely vital that you use keywords naturally. Otherwise your page could come across as low quality or spammy. If you’re targeting a longtail keyword, consider removing your category and sub-category names to keep your URLs short.

Optimizing URLs using keywords also makes it more likely that the anchor text for your links will use relevant keywords.

Meta Tags

Your code is important not just because it creates a quality page for users. Search engines also look at meta tags to learn things about your page. Even if you don’t write your meta tags yourself (this is often done by marketers), you should still understand how they work for SEO. There are three meta tags that are especially important for SEO:

Title tag: The title tag is one of the most important on-page SEO signals, and perhaps the strongest hint you can give search engines about your page’s topic. Use your most important target keyword at the beginning of the title so search engines can see whether the page is relevant to a given search. A well-optimized title tag is no more than 60 characters, including spaces and punctuation, with 50-60 being the ideal length. If you use more than one keyword or include your brand, separate them using pipes (the | character). Your title tag should look like:

<title>SEO Guide for Developers | SEO Best Practices | SitePoint</title>

If you’re optimizing for local search, use your target location, business and industry in your title tag as well as your keyword. So your local title tag might be something like <title>Smith & Sons | Construction | Toledo</title>.

Meta description: Meta descriptions aren’t used directly as a ranking signal, but they are still important for SEO. Search engines sometimes look at them to help determine a page’s topic, and they’re combined with title tags to form your search snippet: the title, link and page description displayed in search results. Snippets essentially work as a free text ad for your page, with keywords matching search queries displayed in bold. A clear and accurate meta description will help increase click-through rate (CTR) and decrease bounce rate, both of which look good to search engines and can help improve your rank. If relevant, include words like "cheap", "sale", "free shipping" or "reviews" to attract in-market searchers. The meta description tag looks like this:

    <meta name="description" content="Your short page description, no more than 160 characters."/>

Robots: The robots meta tag is used to tell search engine crawlers if they can or cannot index a page or follow the links on that page. This meta tag will keep search engines from indexing pages they find by following links on other sites, which would not be prevented by your robots.txt file. The robots meta tag looks like this:

    <meta name="robots" content="noindex"/>

You can also prevent search engines from following the links on your page by adding the "nofollow" value to the content attribute. This is advisable if your page has a lot of links you don’t want to pass value to, or if it includes paid links placed as native advertising. A robots meta tag using "nofollow" would look like this:

    <meta name="robots" content="noindex, nofollow"/>

Note that disallowing pages using the robots.txt file does not negate your need for a robots meta tag. While Google won’t index these pages, they may still show them in the search results, replacing the meta description with ‘A description for this result is not available because of this site's robots.txt’. If you’re using the meta robots noindex tag, make sure you don’t also disallow the page in your robots.txt file, as this will prevent the crawlers from ever seeing it.
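To make the distinction concrete, here is a minimal robots.txt sketch (the /private/ path is hypothetical). A Disallow rule blocks crawling, not indexing, which is why it can’t replace the meta robots tag:

```
# robots.txt at the site root
User-agent: *
# Blocks crawling of /private/, but its URLs can still appear in
# results (without a description) if other sites link to them.
# Pages carrying a meta robots noindex tag must NOT be listed here,
# or crawlers will never see the tag.
Disallow: /private/
```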

Redirects

Developers need to move content around a site all the time, often hosting it at a new URL and setting up a redirect to send visitors to the new page. Redirects are good for your SEO because search engines like when there’s one canonical version of something. So if you have two or more paths to get to the same destination, like if you’ve temporarily moved content to a new folder or copied pages to a subdomain, they tend to get a little confused and will treat your pages as duplicate content.

Using redirects on your old pages pointing to your new pages will make sure that users not only wind up in the right place, but that search engine spiders do too. If you don’t use redirects, you risk search engines serving the wrong page in search results, and assigning trust and authority to outdated URLs.

One of the biggest benefits of using 301 (permanent) and 302 (temporary) redirects is that they pass full link juice on to the destination page. This allows you to move content without suffering much in terms of ranking and traffic. It’s better for users as well, because they won’t have to deal with dead links and 404 pages.

Note that until relatively recently, it was SEO best practice to use 301 instead of 302 redirects because the latter didn’t pass link juice. That’s no longer the case: Google now treats 302 redirects like 301s and passes full PageRank to the destination page. Also, avoid chaining redirects. Search engines don’t like redirect chains, and they’re inefficient for your server as well.

If you’re planning a site migration where the URL paths will remain the same, .htaccess rewrite rules can save you a lot of time by redirecting pages in bulk.
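As a rough sketch of the .htaccess approach on Apache (domain and paths are hypothetical), mod_alias and mod_rewrite can both issue 301 redirects:

```
# .htaccess (Apache) - hypothetical paths and domain

# Redirect a single moved page with a permanent (301) redirect
Redirect 301 /old-page.html /new-page.html

# Redirect an entire section to a new domain, keeping URL paths intact
RewriteEngine On
RewriteRule ^blog/(.*)$ https://www.example.com/blog/$1 [R=301,L]
```

Because the captured path ($1) is appended unchanged, every old URL maps to its new home in a single hop, avoiding the redirect chains mentioned above.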


by Sam Gooch via SitePoint
