Tuesday, August 30, 2016

Building Animated Components, or How React Makes D3 Better

D3 is great. As the jQuery of the web data visualization world, it can do everything you can think of.

Many of the best data visualizations you've seen online use D3. It is a great library, and with the recent v4 update, it became more robust than ever.

Add React, and you can make D3 even better.

Just like jQuery, D3 is powerful but low level. The bigger your visualization, the harder your code becomes to work with, the more time you spend fixing bugs and pulling your hair out.

React can fix that.

You can read my book React+d3js ES6 for a deep dive, or keep reading for an overview of how to best integrate React and D3. In a practical example, we'll see how to build declarative transition-based animations.

A version of this article also exists as a D3 meetup talk on YouTube.

Is React Worth It?

OK, React is big. It adds a ton of code to your payload, and it increases your dependency footprint. It’s yet another library that you have to keep updated.

If you want to use it effectively, you'll need a build step. Something to turn JSX code into pure JavaScript.

Setting up Webpack and Babel is easy these days: just run create-react-app. It gives you JSX compilation, modern JavaScript features, linting, hot loading, and code minification for production builds. It's great.

Despite the size and tooling complexity, React is worth it, especially if you're serious about your visualization. If you're building a one-off that you’ll never have to maintain, debug, or expand, stick to pure D3. If you're building something real, I encourage you to add React to the mix.

To me, the main benefit is that React ~~forces~~ strongly encourages you to componentize your code. The other benefits are either symptoms of componentization, or made possible by it.

The main benefits of using React with your D3 code are:

  • componentization
  • easier testing & debugging
  • smart DOM redraws
  • hot loading

Componentization encourages you to build your code as a series of logical units - components. With JSX, you can use them like they were HTML elements: <Histogram />, <Piechart />, <MyFancyThingThatIMade />. We'll dive deeper into that in the next section.

Building your visualization as a series of components makes it easier to test and debug. You can focus on logical units one at a time. If a component works here, it will work over there as well. If it passes tests and looks nice, it will pass tests and look nice no matter how often you render it, no matter where you put it, and no matter who calls it. 🙌

React understands the structure of your code, so it knows how to redraw only the components that have changes. There’s no more hard work in deciding what to re-render and what to leave alone. Just change and forget. React can figure it out on its own. And yes, if you look at a profiling tool, you'll see that only the parts with changes are re-rendered.

[Animation: alphabet demo redraws, where only the letters that changed are re-rendered]
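React diffs the rendered output for you, but you can also help it skip work entirely. Here's a minimal sketch, assuming React 15.3's PureComponent (the Letter component is hypothetical, for illustration):

class Letter extends React.PureComponent {
    // PureComponent does a shallow comparison of props and state
    // before re-rendering, so this SVG label only redraws when
    // its props actually change.
    render() {
        return (
            <text x={this.props.x} y={this.props.y}>
                {this.props.letter}
            </text>
        );
    }
}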

When you set up your tooling with create-react-app, React can use hot loading. Let's say you're building a visualization of 30,000 datapoints. With pure D3, you have to refresh the page for every code change. Load the dataset, parse the dataset, render the dataset, click around to reach the state you're testing … yawn.

With React, there's no reload and no waiting. Just immediate changes on the page. When I first saw it in action, it felt like eating ice cream while the crescendo of the 1812 Overture played in the background. Mind = blown.

Benefits of Componentization

Components this, components that. Blah blah blah. Why should you care? Your dataviz code already works. You build it, you ship it, you make people happy.

But does the code make you happy? With components, it can. Components make your life easier because they make your code:

  • declarative
  • reusable
  • understandable
  • organized

It's okay if that sounds like buzzword soup. Let me show you.

For instance, declarative code is the kind of code where you say what you want, not how to get it. Ever written HTML or CSS? You know how to write declarative code! Congrats!

React uses JSX to make your JavaScript look like HTML. But don't worry, it all compiles to pure JavaScript behind the scenes.
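For example, Babel turns a single element like this (props are placeholders to show the shape of the output):

<Histogram data={data} width={400} />

into a plain function call:

React.createElement(Histogram, { data: data, width: 400 });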

Try to guess what this code does:

render() {
  // ...
  return (
      <g transform={translate}>
          <Histogram data={this.props.data}
                     value={(d) => d.base_salary}
                     x={0}
                     y={0}
                     width={400}
                     height={200}
                     title="All" />
          <Histogram data={engineerData}
                     value={(d) => d.base_salary}
                     x={450}
                     y={0}
                     width={400}
                     height={200}
                     title="Engineer" />
          <Histogram data={programmerData}
                     value={(d) => d.base_salary}
                     x={0}
                     y={220}
                     width={400}
                     height={200}
                     title="Programmer"/>
          <Histogram data={developerData}
                     value={(d) => d.base_salary}
                     x={450}
                     y={220}
                     width={400}
                     height={200}
                     title="Developer" />
      </g>
  )
}

If you guessed "Renders four histograms", you were right. Hooray.

After you create a Histogram component, you can use it like it was a normal piece of HTML. A histogram shows up anywhere you put <Histogram /> with the right parameters.

In this case, the parameters are x and y coordinates, width and height sizing, the title, some data, and a value accessor. They can be anything your component needs.

Parameters look like HTML attributes, but can take any JavaScript object, even functions. It's like HTML on steroids.

With some boilerplate and the right dataset, the code above gives you a picture like this: a comparison of salary distributions for different types of people who write software.

[Image: four histograms of salary distributions]

Look at the code again. Notice how reusable components are? It's like <Histogram /> was a function that created a histogram. Behind the scenes, JSX compiles each <Histogram /> into a function call - React.createElement(Histogram, props). Histogram becomes a class, and React instantiates it and calls its render method every time you use <Histogram />.
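The article doesn't show Histogram's internals, but here's a minimal sketch of the idea, assuming D3 v4 and one common way to split the work: D3 does the math, React does the rendering. The structure and prop handling are illustrative, not the book's exact code.

import React, { Component } from 'react';
import * as d3 from 'd3';

class Histogram extends Component {
    render() {
        const { data, value, x, y, width, height, title } = this.props;

        // D3 does the math: bin the data, then build pixel scales
        const bins = d3.histogram().value(value)(data);
        const xScale = d3.scaleLinear()
                         .domain([bins[0].x0, bins[bins.length - 1].x1])
                         .range([0, width]);
        const yScale = d3.scaleLinear()
                         .domain([0, d3.max(bins, (bin) => bin.length)])
                         .range([height, 0]);

        // React does the rendering: one <rect> per bin
        return (
            <g transform={`translate(${x}, ${y})`}>
                <text x={width / 2} textAnchor="middle">{title}</text>
                {bins.map((bin, i) => (
                    <rect key={i}
                          x={xScale(bin.x0)}
                          y={yScale(bin.length)}
                          width={xScale(bin.x1) - xScale(bin.x0)}
                          height={height - yScale(bin.length)} />
                ))}
            </g>
        );
    }
}

export default Histogram;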

React components should follow the principles of good functional programming. No side effects, statelessness, idempotency, composability. Unless you really, really want to break the rules.

Unlike JavaScript functions where following these principles requires deliberate effort, React makes it hard not to code that way. That's a win when you work in a team.
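The simplest components make this obvious: they're literally pure functions of their props. A tiny sketch (the Label component is hypothetical):

// Same props in, same elements out. No state, no side effects.
const Label = ({ x, y, text }) => (
    <text x={x} y={y}>{text}</text>
);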

Continue reading Building Animated Components, or How React Makes D3 Better on SitePoint.


by Swizec Teller via SitePoint

On Page SEO Checklist for the Optimized Page

This article is part of an SEO series from WooRank. Thank you for supporting the partners who make SitePoint possible.

The content on your page, and the technical elements behind the scenes, are the most accessible and controllable SEO factors for your site. They're the best place to start when you're looking to improve your rankings or redesign your site. Plus, putting together a page with well-optimized on page elements will help you with your off page efforts.

Since your goal is to make your site as search engine friendly as possible, we’ve got a checklist of how to optimize each on page element, broken up into technical and content optimizations.

Technical On Page Elements

URLs

Optimize your URLs for both human and search engine usability; both affect your SEO. Make sure they clearly show where the page stands in the site's hierarchy of information, so users always know where they are, and include the page's most important keyword. Take WooRank's SEO guide on link juice for example: http://ift.tt/290glxH. Users can look at that URL and know that:

  1. It’s about link juice.
  2. It’s part of the SEO Guide category, which is part of the Educational category.
  3. The page is in English, and the fact that this is specified is a pretty strong hint that at least one other language option is available.

A more generic URL, like this one for a Banana Republic product page, http://ift.tt/2byyl98, is a lot less useful for users, who can’t see what product or category it’s for.

URL structure is also important to search engines because it helps them understand the importance of a page in relation to the rest of the website. Including keywords in the URL tells them how relevant the page is to a certain topic or keyword. Having well-optimized URLs also helps attract more relevant backlinks since people are more likely to use relevant anchor text for URLs that include keywords.

Title Tag

Title tags are one of the most important parts of on page SEO. Search engines see them as maybe the strongest hint regarding the page’s topic. Put your most important keywords in title tags so search engines can see how relevant a page is to a search. It’s important to write unique title tags for each page since, in theory, all of your pages are different (otherwise you’d have duplicate content).

A well-optimized title tag is no more than 60 characters (50-60 characters is the ideal range) and uses keywords at the beginning of the tag. If you want to include more than one keyword, or put your brand in the title as well, separate them with pipes (the | character). A well-optimized title for this article might look like:

<title>On Page SEO Checklist | SEO Best Practices | SitePoint</title>

Be careful when writing your titles, though, because search engines have gotten really good at figuring out when you’re trying to manipulate them. Stuffing your title tag with keywords will end up doing more harm than good: this page crammed its title tag with keywords related to deals on the iPhone 5 and could only get to the 54th page on Google.

[Screenshot: an over-optimized, keyword-stuffed title tag]

If you’re optimizing for local search, include not only your keyword but also your target location, business and industry.

Meta Description

Search engines sometimes use meta descriptions to help determine the topic of a page, but don’t really use them as a ranking factor. However, they do use them, along with title tags, to form a page’s search snippet. Search snippets are the titles, links and descriptions displayed by search engines in SERPs.

Think of search snippets and meta descriptions as free text ads for your website — use them to entice users to click through to your page. Keywords used in your description will appear in bold when they match keywords in user search queries, so use them when appropriate, and, if possible, include prices and commercial words like cheap, free shipping, deals and reviews to attract in-market users.
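For example (wording hypothetical), a meta description for this article might look like:

<meta name="description" content="Optimize your title tags, URLs, meta descriptions, robots.txt and sitemap with this free on page SEO checklist from SitePoint and WooRank.">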

Having a high click through rate (CTR) looks good to search engines and will help increase your rank. However, having a high bounce rate will look bad, so clearly and accurately describe the information on your page.

Robots.txt

A robots.txt file is a simple text file, stored in your website's root directory, that specifies which pages can and cannot be crawled by search engine bots. It's generally used to keep search engines from crawling and indexing pages you wouldn't necessarily want to show up in search results, such as private folders, temporary folders or your legacy site after a migration. You can block all, none or individual bots to save on bandwidth.

There are two parts to a robots.txt file: User-agent and Disallow. User-agent names the crawler you want the rules to apply to. It's most often a wildcard, represented with an asterisk (*), which means the rules apply to all bots, but it can also name individual bots via their user-agent strings. Disallow specifies which pages, directories or files the named user-agent cannot access. A blank Disallow line means bots can access the whole site, while a lone slash (/) blocks the whole server.

So to block all bots from the whole server, your robots.txt should look like this:

User-agent: *
Disallow: /

To allow all crawlers access to the whole site use:

User-agent: *
Disallow:

Blocking a bot (in this example, Googlebot) from accessing certain folders, files and file types would look like this:

User-agent: Googlebot
Disallow: /tmp/
Disallow: /folder/file.html
Disallow: /*.ppt$

Some crawlers even let you get granular by using the Allow directive, so you can allow access to individual files stored in a disallowed folder:

User-agent: *
Disallow: /folder/
Allow: /folder/file.html

Note that a robots.txt disallow won't keep a page out of search results if a bot finds your URL somewhere outside of your site; the URL can still end up indexed. If you want real blocking power, use a robots meta tag with a "noindex" value:

<meta name="robots" content="noindex">

Unfortunately, not all bots are legitimate, so some spammers may still ignore your robots.txt and robots meta tag. The only way to stop those bots is by blocking IP access through your server configuration or network firewall.

Sitemap

Sitemaps are XML files that list every URL on a website and provide a few key details, such as the date of last modification, change frequency and priority. If you've got a multilingual site, you can also list your hreflang links in your sitemap. A (very simple) sitemap for a site with one page looks like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://ift.tt/xwbjRF" xmlns:xhtml="http://ift.tt/2bTDKEE">
    <url>
        <loc>http://ift.tt/2byyU2M</loc>
        <lastmod>2016-08-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.9</priority>
        <xhtml:link rel="alternate" hreflang="fr" href="http://ift.tt/2bTCP7v" />
    </url>
</urlset>

<urlset> opens and closes the sitemap and declares the sitemap protocol version in use. <loc> is the address of the web page and is required. Always use uniform URLs in your sitemap (https, www resolve, etc.). <changefreq> tells search engines how often you update your page, which can encourage them to crawl your site more often. Don't lie, though, since they can tell when the <changefreq> doesn't match up with actual changes, which may cause them to just ignore this parameter. <priority> tells how important this URL is compared to others, which helps bots crawl your site more intelligently, giving preference to higher priority pages. <xhtml:link> tags give URLs for alternate versions of the page.

The <loc> element is required since it tells crawlers where to find the page, while the other values are optional. Learn more about URL attributes and other types of sitemaps in our sitemaps beginners guide.

Canonical URLs

It's easy to wind up inadvertently hosting duplicate content due to your content management system, syndicated content or e-commerce shopping systems. In these cases, use the rel="canonical" tag to point search engines to the original content. When search engines see this annotation, they know the current page is a copy, and they pass link juice, trust and authority to the original page.
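For example, a duplicate page would point to the original from its <head> like this (URLs hypothetical):

<link rel="canonical" href="https://www.example.com/products/blue-widget/">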

Continue reading On Page SEO Checklist for the Optimized Page on SitePoint.


by Greg Snow-Wasserman via SitePoint

Material Design Select with jQuery

A simple and clean Material Design select, created using jQuery.


via jQuery-Plugins.net RSS Feed

Spectre: A Lightweight CSS Framework

Frameworks reduce development time for projects considerably. A few of them, like Bootstrap, are quite popular and offer a lot of features, but you might not need all of that for your project. Today, we will focus on a new framework called Spectre. It is lightweight, modern, responsive and mobile friendly, weighing around 6.8KB when served minified and gzipped. Besides the basic grid system, it also has a lot of other useful components, like tabs, modals and cards.

This tutorial will provide a brief overview of this framework, followed by some guidance to help you get started quickly.

Installation

You can either download the minified Spectre.css file directly or use npm or Bower to install it. Once you are done, you can include the file in your project like a regular stylesheet.

[code language="html"]
<link rel="stylesheet" href="link/spectre.min.css" />
[/code]

You can also create your own customized version of the framework by editing the Less files in the /src directory or by removing unneeded components from the spectre.less file. Then you can build your CSS file from the command line by using Gulp.

Grid System

Spectre has a flexbox-based responsive grid system with 12 columns. The width of each column is determined by its class name. Each class begins with col- followed by a number which represents how many columns wide this particular element should be. For example, col-12 is 12 columns wide, which gives it a width of 100%, and col-3 is 3 columns wide, or a quarter of col-12, which gives it a width of 25%. By default, the different columns will have some gap between them. You can collapse that gap by adding the class col-gapless to their container. Just like Bootstrap, it also offers classes like col-md-[1-12], col-sm-[1-12] and col-xs-[1-12] to help you control the width of elements when the size of the viewport changes.

It also provides classes such as hide-xs, hide-sm and hide-md to hide elements on specific viewport sizes.

All the columns will show up as a single row when the viewport is narrower than 480px. The col-xs-* classes apply to viewports wider than 480px. Similarly, col-sm-* applies to viewports wider than 600px, and col-md-* applies to viewports wider than 800px.
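For example, this (hypothetical) half-width column disappears entirely on extra-small screens:

[code language="html"]
<div class="container">
    <div class="columns">
        <div class="column col-6 hide-xs">
            <div class="col-content">Half width, hidden on extra-small screens</div>
        </div>
    </div>
</div>
[/code]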

The following code snippet creates one column with width 33.333% (col-4), two columns with width 25% (col-3) and one column with width 16.66% (col-2).

[code language="html"]
<div class="container">
    <div class="columns">
        <div class="column col-4">
            <div class="col-content">col-4</div>
        </div>
        <div class="column col-3">
            <div class="col-content">col-3</div>
        </div>
        <div class="column col-3">
            <div class="col-content">col-3</div>
        </div>
        <div class="column col-2">
            <div class="col-content">col-2</div>
        </div>
    </div>
</div>
[/code]

Continue reading Spectre: A Lightweight CSS Framework on SitePoint.


by Baljeet Rathi via SitePoint

Designhorse

Designhorse specialize in brand identities, packaging design and web design. We create, revive and maintain brands to meet the market's demands.


by csreladm via CSSREEL | CSS Website Awards | World best websites | website design awards | CSS Gallery

Ales Nesetril – Portfolio

A portfolio site for Ales Nesetril, a product designer from Prague, Czech Republic, who focuses on interactive experiences and mobile apps.


by csreladm via CSSREEL | CSS Website Awards | World best websites | website design awards | CSS Gallery

M.Sabai – Designer x Coder

Portfolio website of Michael Sabai, a Creative Director, Web Designer, Coder & Photographer.


by csreladm via CSSREEL | CSS Website Awards | World best websites | website design awards | CSS Gallery