Wednesday, May 31, 2017

How to Advertise on Facebook in 10 Steps


Blood, sweat, and tears were spilled, but you’ve finally mastered Google AdWords. Thanks to your hard work, you’re capturing leads left and right. You’ve peddled a warehouseful of widgets.

But you want more. You need more.

You know that Facebook is the logical leap to an even better return on your advertising dollars. To truly dominate Facebook advertising, though, you’re going to need more than just a keen understanding of how the platform works. It takes creativity, preparation, and will.

In fact, the road to becoming a Facebook advertising champion has a lot in common with chasing the opportunity of a lifetime in front of a hometown crowd and triumphing against all odds despite immense adversity and personal turmoil. Cue “Eye of the Tiger”…

How to Advertise on Facebook (infographic by WordStream)

How to Advertise on Facebook, Step #1: Create Your Business Page & Ad Account

Get to know your customers. Be engaging, learn their common interests, learn what keeps them up at night.

This will allow you to create better ads for more targeted audiences that you can nurture effectively. Only. Using. Facebook.

If you’re already advertising on Facebook, check out our Facebook Ad Industry Benchmarks and see how you stack up against your competition.

Skeptical about getting started? Maybe Does Facebook Advertising Work? is more your speed.

The Facebook Advertising Halo Effect

Our research shows that Facebook advertisers see average organic post impressions 225% higher than businesses not advertising on Facebook.

Facebook advertising has a positive impact on organic Facebook traffic

In addition to the boost in average organic post impressions, Facebook advertisers outperform businesses that aren't advertising, on average, to the tune of:

  • 77% more page fans
  • 96% more page clicks
  • 126% more page impressions
  • 90% more fans reached
  • 111% more friends of page fans reached

What does this mean? Facebook rewards businesses that spend money by amplifying their unpaid content for free.

How to Advertise on Facebook, Step #2: Add the Facebook Pixel to Your Website

The Facebook Pixel can be as simple or as sophisticated as you need it to be; either way, you need it if you want to measure the return you're getting on your ad spend.

The Pixel can be optimized for any type of on-site action, and it lets you build remarketing lists (more on those in Step #7). If it's not already on your website, go add it!

Facebook advertising needs the Facebook Pixel, or it just won't work

To learn more about adding the Facebook Pixel to your website, check out The Ultimate Guide to Tracking, Targeting, and Driving Conversions on Facebook by SMM expert Brett McHale.
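If you're curious what you're actually installing, here's a simplified sketch of the Pixel's event calls. In practice, paste the full base code generated in Ads Manager (it includes an asynchronous loader for fbevents.js); "YOUR_PIXEL_ID" is a placeholder for your own ID.

```javascript
// Simplified sketch of Facebook Pixel event calls -- use the full base
// code from Ads Manager in production. YOUR_PIXEL_ID is a placeholder.
fbq('init', 'YOUR_PIXEL_ID'); // ties events on this site to your ad account
fbq('track', 'PageView');     // the base event, fired on every page

// Standard events let you optimize for, and report on, specific actions:
fbq('track', 'Lead');                                        // e.g. a form fill
fbq('track', 'Purchase', { value: 29.99, currency: 'USD' }); // a sale, with value
```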

How to Advertise on Facebook, Step #3: Uncover Your Ideal Audiences

There are nearly 2 billion active Facebook users, and most of them aren’t interested in your product or service (sorry).

Luckily, you can use any combination of geographic, demographic, behavioral, and interest targeting to find the people who are.

Facebook allows you to find potential customers based on virtually any parameter. You can find amateur pugilists in Arkansas or lifelong pacifists who eat cricket chips. You can find your ideal customer.

For inspiration in developing your own audiences, check out our epic infographic on every Facebook Ad targeting option available to you.

What is Relevance Score?

Relevance Score is a measure of the quality of your Facebook ad based on positive and negative feedback from your audience.


A higher Relevance Score reduces your cost per click. If your Facebook ads aren’t working, there’s a good chance it’s got something to do with Relevance.

It’s most important when your goals are based on clicks, visibility, brand awareness, brand engagement, or very top-funnel marketing metrics.

How to Advertise on Facebook, Step #4: Pick the Perfect Ad Format

There are more than a dozen ad formats available to you across Facebook and Instagram.

Align your ad creative and copy with your offering and audience. The higher up the funnel (or less familiar with your brand) a prospect is, the less complex (in both format and offering) your ad should be. Make their lives easy.

Grab our Facebook Ad-Type Cheat Sheet so you know which ad format is right for your next campaign.

How to Advertise on Facebook, Step #5: Optimize Bidding & Budget Allocation

On Facebook, the competition is fierce!

Facebook advertising industry benchmarks

Assign most of your overall budget to campaigns that can be tied to revenue; while brand-building is important, it doesn’t keep the lights on next month.

Combine targeted audiences and killer creative with what you know about the Facebook auction to bid competitively (within your budget).

To learn more about dominating your prospects’ news feeds, check out How to Compete in Facebook Ads.

The Facebook Auction

In a Facebook ad auction, victory goes to the advertiser with the highest “total value.” Total value is based on three factors:

  • Advertiser Bid
  • Ad Quality + Relevance
  • Estimated Action Rates

These factors will determine how much Facebook Advertising costs your business.

Note that, per Facebook, “you’ll often be charged less than you bid…there’s no advantage to underbidding.”
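Facebook doesn't publish the exact formula, but total value is often approximated as bid multiplied by estimated action rate, plus a quality/relevance term. Here's a toy sketch of the mechanics; all numbers and the weighting are illustrative assumptions, not Facebook's:

```javascript
// Toy model of the Facebook ad auction. The weighting is an assumption
// for illustration; Facebook does not publish its exact formula.
function totalValue(ad) {
  return ad.bid * ad.estimatedActionRate + ad.quality;
}

const contenders = [
  { name: 'Ad A', bid: 5.0, estimatedActionRate: 0.02, quality: 0.05 },
  { name: 'Ad B', bid: 3.0, estimatedActionRate: 0.05, quality: 0.08 },
];

// Highest total value wins: a lower bid can beat a higher one if its
// estimated action rate and quality are strong enough.
const winner = contenders.reduce((a, b) => (totalValue(a) >= totalValue(b) ? a : b));
console.log(winner.name); // "Ad B" -- 0.23 beats Ad A's 0.15
```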

How to Advertise on Facebook, Step #6: Make Gorgeous Ads (That Convert)

Facebook ads give you the ability to combine great copy with engaging visuals to produce high-converting ads.

Remember: the best way to maximize Facebook and Instagram’s most engaging ad formats, from GIFs to Canvases, is to tell a story.

To learn more about getting started with Facebook advertising, check out the Facebook Creative Hub. It allows you to explore amazing ads in every format available to you on both Facebook and Instagram.

Get inspired, then conceptualize and execute your own exceptional ad creative.

How to Advertise on Facebook, Step #7: Remarket on Facebook

You installed your Pixel back in Step #2; now it's time to cash in.

Leverage your wealth of site-visitor information to turn prospects into customers. If someone downloaded a whitepaper, offer them a demo. Did a handful of your customers only buy boxing gloves? Remarket to them with a speed bag.
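As a concrete (and hypothetical) example: if your Pixel fires Purchase events with product metadata, you can build a Custom Audience of boxing-glove buyers in Ads Manager and serve them the speed-bag ad. The SKU and values below are made up:

```javascript
// Hypothetical tag on the boxing-gloves order-confirmation page.
// The content metadata is what lets you segment "glove buyers" into a
// remarketing audience later in Ads Manager.
fbq('track', 'Purchase', {
  content_ids: ['boxing-gloves-001'], // made-up SKU
  content_type: 'product',
  value: 49.99,
  currency: 'USD',
});
```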


To learn more about remarketing on Facebook at every stage of your sales funnel, check out 11 Ways to Turn Prospects into Customers.

How to Advertise on Facebook, Step #8: Eat, Optimize, Sleep, Repeat

Testing is the Facebook ads equivalent of going to the gym: You have to do it if you want to be the best.

Adjust bids, audiences, and creative (visual and copy) often. Maybe weave in some power words. Facebook even allows you to set up A/B testing within the Business Manager UI!

To take your Facebook ads to the next level, try implementing some of our out-of-the-box Facebook targeting strategies.

How to Advertise on Facebook, Step #9: Target New, More Qualified Prospects

Spend less time and money digging through uninterested prospects. We've pulled together a few insanely specific Facebook Audiences to show you just how granular you can get with these things.

Lookalike audiences allow you to find new prospects with attributes that mirror those of an existing audience.


From a single seed audience you can create multiple Lookalikes based on degree of similarity. That’s how you build scale.

No matter how niche your niche is, it’s possible to use lookalikes to whittle the perfect new audience.

How to Advertise on Facebook, Step #10: WIN

Against all odds, you’ve become a Facebook Ads champion. Your campaigns are a knockout!

Keep that championship belt around your waist: Check out the 10 Best Facebook Advertising Features Right Now and stay up-to-date on everything Facebook ads.

About the Author

Allen Finn is a content marketing specialist and the reigning fantasy football champion at WordStream.  He enjoys couth menswear, dank eats, and the dulcet tones of the Wu-Tang Clan. If you know what’s good for you, you’ll follow him on LinkedIn and Twitter.

from Internet Marketing Blog by WordStream http://ift.tt/2rEsINm





Optimizing AngularJS Single-Page Applications for Googlebot Crawlers

Posted by jrridley

It’s almost certain that you’ve encountered AngularJS on the web somewhere, even if you weren’t aware of it at the time. Here’s a list of just a few sites using Angular:

  • Upwork.com
  • Freelancer.com
  • Udemy.com
  • Youtube.com

Any of those look familiar? If so, it’s because AngularJS is taking over the Internet. There’s a good reason for that: frameworks like Angular and React make for a better user and developer experience on a site. For background, AngularJS and ReactJS are part of a web development approach called single-page applications, or SPAs. While a traditional website loads each individual page as the user navigates the site, including calls to the server and cache, loading resources, and rendering the page, SPAs cut out much of the back-end activity by loading the entire application when a user first lands on a page. Instead of loading a new page each time you click on a link, the site dynamically updates a single HTML page as the user interacts with it.


Image c/o Microsoft

Why is this movement taking over the Internet? With SPAs, users are treated to a screaming-fast site through which they can navigate almost instantaneously, while developers have a template that allows them to customize, test, and optimize pages seamlessly and efficiently. AngularJS and ReactJS use JavaScript templates to render the site, which means the HTML/CSS page-load overhead is almost nothing. All site activity runs behind the scenes, out of view of the user.

Unfortunately, anyone who’s tried performing SEO on an Angular or React site knows that the site activity is hidden from more than just site visitors: it’s also hidden from web crawlers. Crawlers like Googlebot rely heavily on HTML/CSS data to render and interpret the content on a site. When that HTML content is hidden behind website scripts, crawlers have no website content to index and serve in search results.

Of course, Google claims they can crawl JavaScript (and SEOs have tested and supported this claim), but even if that is true, Googlebot still struggles to crawl sites built on a SPA framework. One of the first issues we encountered when a client first approached us with an Angular site was that nothing beyond the homepage was appearing in the SERPs. Screaming Frog crawls uncovered the homepage and a handful of other JavaScript resources, and that was it.


Another common issue is recording Google Analytics data. Think about it: Analytics data is tracked by recording pageviews every time a user navigates to a page. How can you track site analytics when there’s no HTML response to trigger a pageview?

After working with several clients on their SPA websites, we’ve developed a process for performing SEO on those sites. By using this process, we’ve not only enabled SPA sites to be indexed by search engines, but even to rank on the first page for keywords.

5-step solution to SEO for AngularJS

  1. Make a list of all pages on the site
  2. Install Prerender
  3. “Fetch as Google”
  4. Configure Analytics
  5. Recrawl the site

1) Make a list of all pages on your site

If this sounds like a long and tedious process, that’s because it definitely can be. For some sites, this will be as easy as exporting the XML sitemap for the site. For other sites, especially those with hundreds or thousands of pages, creating a comprehensive list of all the pages on the site can take hours or days. However, I cannot emphasize enough how helpful this step has been for us. Having an index of all pages on the site gives you a guide to reference and consult as you work on getting your site indexed. It’s almost impossible to predict every issue that you’re going to encounter with an SPA, and if you don’t have an all-inclusive list of content to reference throughout your SEO optimization, it’s highly likely you’ll leave some part of the site un-indexed by search engines inadvertently.

One solution that might enable you to streamline this process is to divide content into directories instead of individual pages. For example, if you know that you have a list of storeroom pages, include your /storeroom/ directory and make a note of how many pages that includes. Or if you have an e-commerce site, make a note of how many products you have in each shopping category and compile your list that way (though if you have an e-commerce site, I hope for your own sake you have a master list of products somewhere). Regardless of what you do to make this step less time-consuming, make sure you have a full list before continuing to step 2.

2) Install Prerender

Prerender is going to be your best friend when performing SEO for SPAs. Prerender is a service that will render your website in a virtual browser, then serve the static HTML content to web crawlers. From an SEO standpoint, this is as good a solution as you can hope for: users still get the fast, dynamic SPA experience, while search engine crawlers can identify indexable content for search results.

Prerender’s pricing varies based on the size of your site and the freshness of the cache served to Google. Smaller sites (up to 250 pages) can use Prerender for free, while larger sites (or sites that update constantly) may need to pay as much as $200+/month. However, having an indexable version of your site that enables you to attract customers through organic search is invaluable. This is where that list you compiled in step 1 comes into play: if you can prioritize what sections of your site need to be served to search engines, or with what frequency, you may be able to save a little bit of money each month while still achieving SEO progress.

3) “Fetch as Google”

Within Google Search Console is an incredibly useful feature called “Fetch as Google.” “Fetch as Google” allows you to enter a URL from your site and fetch it as Googlebot would during a crawl. “Fetch” returns the HTTP response from the page, which includes a full download of the page source code as Googlebot sees it. “Fetch and Render” will return the HTTP response and will also provide a screenshot of the page as Googlebot saw it and as a site visitor would see it.

This has powerful applications for AngularJS sites. Even with Prerender installed, you may find that Google is still only partially displaying your website, or it may be omitting key features of your site that are helpful to users. Plugging the URL into “Fetch as Google” will let you review how your site appears to search engines and what further steps you may need to take to optimize your keyword rankings. Additionally, after requesting a “Fetch” or “Fetch and Render,” you have the option to “Request Indexing” for that page, which can be a handy catalyst for getting your site to appear in search results.

4) Configure Google Analytics (or Google Tag Manager)

As I mentioned above, SPAs can have serious trouble with recording Google Analytics data since they don’t track pageviews the way a standard website does. Instead of the traditional Google Analytics tracking code, you’ll need to install Analytics through some kind of alternative method.

One method that works well is to use the Angulartics plugin. Angulartics replaces standard pageview events with virtual pageview tracking, which tracks the entire user navigation across your application. Since SPAs dynamically load HTML content, these virtual pageviews are recorded based on user interactions with the site, which ultimately tracks the same user behavior as you would through traditional Analytics. Other people have found success using Google Tag Manager “History Change” triggers or other innovative methods, which are perfectly acceptable implementations. As long as your Google Analytics tracking records user interactions instead of conventional pageviews, your Analytics configuration should suffice.
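For an AngularJS 1.x app, the Angulartics setup boils down to loading the core module plus a Google Analytics adapter. A minimal sketch (module names as I recall them from the plugin's docs; verify against the current documentation):

```javascript
// Minimal sketch of Angulartics with the Google Analytics adapter.
// The standard GA snippet stays on the page, minus its automatic
// ga('send', 'pageview') call, which Angulartics replaces with
// virtual pageviews fired on every route change.
angular.module('myApp', [
  'ngRoute',
  'angulartics',                  // core virtual-pageview tracking
  'angulartics.google.analytics', // sends those pageviews to GA
]);
```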

5) Recrawl the site

After working through steps 1–4, you’re going to want to crawl the site yourself to find the errors that not even Googlebot was anticipating. One issue we discovered early with a client was that, even after installing Prerender, our crawlers were still running into a spider trap: the crawl kept growing until it had surfaced roughly 150,000 URLs.

There were not actually 150,000 pages on that particular site. Our crawlers had simply found a recursive loop that kept generating longer and longer URL strings for the site content. This is something we would not have found in Google Search Console or Analytics. SPAs are notorious for causing tedious, inexplicable issues that you’ll only uncover by crawling the site yourself. Even if you follow the steps above and take as many precautions as possible, I can still almost guarantee you will come across a unique issue that can only be diagnosed through a crawl.
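If you're scripting your own sanity checks on a crawl export, one quick heuristic for this kind of recursive-loop trap is to flag URLs whose paths repeat the same segment over and over. This helper is hypothetical, not part of the process above:

```javascript
// Hypothetical helper: flag crawled URLs whose path repeats the same
// segment more than maxRepeats times, e.g. /shop/shop/shop/shop/...
function looksLikeSpiderTrap(url, maxRepeats = 3) {
  const segments = new URL(url).pathname.split('/').filter(Boolean);
  const counts = {};
  for (const seg of segments) {
    counts[seg] = (counts[seg] || 0) + 1;
    if (counts[seg] > maxRepeats) return true;
  }
  return false;
}

looksLikeSpiderTrap('https://example.com/shop/shop/shop/shop/item'); // true
looksLikeSpiderTrap('https://example.com/shop/item');                // false
```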

If you’ve come across any of these unique issues, let me know in the comments! I’d love to hear what other issues people have encountered with SPAs.

Results

As I mentioned earlier in the article, the process outlined above has enabled us to not only get client sites indexed, but even to get those sites ranking on the first page for various keywords. For one client with an AngularJS site, we saw steady keyword progress along with strong organic traffic growth over the course of seven months.

All of this goes to show that although SEO for SPAs can be tedious, laborious, and troublesome, it is not impossible. Follow the steps above, and you can have SEO success with your single-page app website.


from Moz Blog http://ift.tt/2rTk79h





Tuesday, May 30, 2017

5 Hot Blog Design Trends in 2017

From the outlandish aesthetics of the world of fashion to the sleek minimalism of contemporary architecture, design is a fickle mistress.

Although all facets of the design industry move quickly, few aspects move quite as quickly – or date quite as badly – as web design.


Whether you’re launching a new blog or thinking about a blog redesign, you want to be sure you’re implementing a design that’s going to look contemporary and function perfectly across devices.

Today, we’ll be taking a look at the prevailing blog design trends we’ve seen across the internet so far this year. We’ll examine what elements have proven popular and those that have fallen out of favor.

First, let’s take a look at the wider trends in digital marketing that are shaping how blog design looks right now.

Blog Design in 2017: A Mobile-First World

Before we examine the specific design trends that are driving the aesthetics of today’s internet, we need to revisit some data that’s a perennial favorite with marketers – mobile adoption and browsing statistics.

More People Are Using Mobile Devices Exclusively to Access the Web

Here at the WordStream blog, we’ve been banging on about growing mobile adoption for years (along with practically every other digital media blog in the world).

Typically reserved for end-of-year prediction lists, mobile adoption statistics are arguably among the most important data points when it comes to design. Why? Because necessity is the mother of invention, and as more and more people turn to mobile devices to access the web, blogs and other sites are adjusting their look to suit changing tastes.

Take a look at the figure below, taken from StatCounter data:


Worldwide internet usage by device, October 2009–October 2016 (via StatCounter)

As you can see, between October 2009 and October 2016, the decline in desktop web access correlates almost perfectly with the growth in the use of mobile devices; as increasing numbers of people use mobile devices to access the internet, fewer people rely on desktops.

This trend is also reflected in time spent per device. According to a recent report published by Comscore, the share of time spent using digital devices has increased across every mobile metric – time spent on mobile devices in general, time spent using mobile apps, and time spent using smartphone apps – while the overall share of time spent using desktops decreased considerably.


Share of time spent on digital media by platform (via comScore)

Although marketers have been warning of the fabled “mobile-first” world for years, we only just entered this paradigm toward the end of last year. But what does this mean for blog design?

Balancing the Aesthetic with the Technical

The steady increase in the number of people using mobile devices as their primary means of accessing the web isn’t just interesting from a user experience perspective; it presents unique challenges in balancing aesthetic considerations with the very real limitations of technical specifications.

Blogs Have to Be Technically Lightweight

When we talk about page-load time – a crucial usability metric – it’s typically within the context of conversion rate optimization. The longer your pages take to load, the less likely your visitors are to remain on your site and convert. This has a direct impact on other metrics such as bounce rate and time-on-page.


Conversion rates by load time (via TruConversion)

However, from a user experience perspective, page-load time can have other consequences. For example, did you know that a delay in the load time of a web page can result in an increase in heart rate of 38% – the same increase observed in individuals watching a horror movie? Scary stuff indeed.


Slow mobile load times cause significant cognitive stress (via Ericsson Consumer Lab)

As Steve Jobs famously opined, design isn’t merely how something looks – it’s how something works. Understanding this is crucial to designing aesthetically pleasing and rewarding web experiences.

So, now that we’ve examined some of the wider trends that are shaping modern blog design, let’s take a look at some of the individual stylistic elements that are making waves in web design right now.

Blog Design Trend #1: The Evolution of Mobile Design

Since we’ve spent so much time talking about the importance of mobile access, it seems only fitting that the first design element we’ll be taking a look at is responsive design.

What is Responsive Design?

Responsive web design (RWD) is a term first coined by renowned web developer Ethan Marcotte (who literally wrote the book on responsive web design) that describes an approach to web design and development that centers around making websites accessible and functional across a wide range of devices.


Responsive design exemplified

This is accomplished primarily through the use of media queries, a CSS3 feature that allows designers to specify that certain style rules be enabled or disabled depending on the resolution or screen size of a user’s device.
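In CSS, a media query looks like "@media (max-width: 768px) { … }". The same conditions are also exposed to JavaScript via window.matchMedia, which is handy when a layout change needs script support. A small sketch (the 768px breakpoint is an arbitrary example):

```javascript
// Media queries from JavaScript: toggle a layout class whenever the
// viewport crosses an (example) 768px breakpoint.
const mq = window.matchMedia('(max-width: 768px)');

function applyLayout(e) {
  document.body.classList.toggle('mobile-layout', e.matches);
}

applyLayout(mq);             // set the initial state on page load
mq.addListener(applyLayout); // re-run when the breakpoint flips
                             // (addEventListener('change', ...) in newer browsers)
```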

Back in 2014 (practically a lifetime ago in web development), responsive design was the tech du jour. Executives with little technical understanding of their own websites asked beleaguered designers to just “make it responsive,” and believed that the cheapest, fastest way to get on the mobile bandwagon was to simply add CSS rules to a static, desktop site.

Is Responsive Design Dead?

Many developers and designers have turned their backs on responsive design. Ethan Marcotte once said that “responsive web design isn’t intended to serve as a replacement for mobile websites,” and this mindset has become increasingly prevalent among the web design community.


Page size by file type and resolution

One of the main reasons that many developers are rejecting responsive web design is inefficiency and bloat. According to research by web security and performance expert Guy Podjarny, virtually every responsive site he studied delivered the full payload of an entire desktop site, regardless of how much data was actually requested by the user. This meant hundreds of additional, unnecessary kilobytes of data were being downloaded, even for minimal server requests, resulting in terrible performance, user interface problems, and other issues.

Of course, responsive web design isn’t solely responsible for these results – bad websites are bad no matter what device they’re viewed on, plain and simple. This research does, however, highlight how poorly implemented design choices can have a disastrous impact on your site.

Something else to consider is the fact that sites designed and built using RWD will always be slower than native applications. Today, many blogs and websites are designed specifically for various devices, a shift in thinking that puts users first. It may be more work initially, but it’s worth it in the long run.

Progressive Web Apps

Responsive design may be somewhat polarizing in the web design community, but one trend that few designers or developers could argue with is the incredible popularity of progressive web apps.


Progressive web apps look and feel like native applications and offer many of the same features, including “Add to Home Screen” functionality.

Progressive web apps are websites that look, feel, and function like apps developed for mobile devices, and offer users an app-like experience in their browser. This includes functionality such as push notifications, accessibility from mobile home screens, offline modes, and other elements that will be familiar to you if you’ve ever used an app on your phone.

Although the nuances of progressive web apps are a little beyond the scope of this post, it’s a fascinating area of web technology (if you’re a nerd like me) that’s definitely deserving of further reading. I’d recommend you check out this content about progressive web apps at the Google developer resource center.

Blog Design Trend #2: Hero Images & Minimalist Design

As bandwidth has increased (along with the processing power of mobile devices), we’ve seen a steady migration away from dense, text-heavy blogs to more visual sites with large, colorful images. One of the most prominent trends in blog design imagery has been the increasing popularity of huge “hero” images – a trend that’s likely to continue for at least the foreseeable future.

Many blogs and sites use this technique to great effect. Take this example from foodie companion site The Infatuation:

A hero image in action on The Infatuation

You’ve probably seen countless examples of this technique in action as you browse the web. It’s easy to see why the use of hero images has increased so dramatically in recent years. They’re eye-catching, visually striking, and can serve as the basis for entire designs, especially single-page websites with long scroll depth and sites that rely on gorgeous, high-resolution imagery – like food blogs.

Of course, hero images aren’t without their drawbacks. Sites that opt to use this design element must ensure that their images are tightly optimized for technical considerations such as page-load time, and there are unique accessibility concerns to address too, such as correct use of alt text and other metadata fields.

The Growing Trend of Blog Minimalism

Although many websites have firmly embraced large, bold images in their designs, other sites have been quietly advancing another, distinctly different design aesthetic – a truly minimalist approach that eschews images altogether.

Medium-style minimalism

Popular blogging and web publishing platform Medium (founded by Twitter co-founder Ev Williams) has been one of the driving forces of this design aesthetic. Yes, Medium does support images, and some Medium bloggers use them to great effect. However, a huge number of Medium publishers use no images at all, a trend that some other blogs have seized upon.

Minimal blog designs can be highly effective for certain types of content. Perhaps unsurprisingly, this aesthetic has proven popular among writers, but we’ve also seen several mainstream blogs adopt an image-light design. For some types of blog post – such as opinion-based pieces – this format can work extremely well. For others – such as the kind of instructional content WordStream publishes, for example – not so much.

Blog Design Trend #3: Duotone Color

Color is among the most important design elements you can use, and this year, we’ve seen continued adoption of strong, bold color schemes on many blogs and websites.

In terms of specific trends, the use of duotone color schemes remains popular. Some sites favor complementary duotone schemes, which pair two colors from opposite sides of the color wheel, and few sites have done more to further this trend than music streaming service Spotify:


Spotify has been championing the complementary duotone scheme for some time now, and this design trend features prominently across the site and Spotify’s marketing campaigns.

The image above is a powerful example of how this simple yet striking technique can be used to great effect, particularly when paired with strong, minimalist type (more on this shortly). Yes, technically there are more than two colors in play here, but you can see the effect they’re going for.

Duotone color schemes are incredibly versatile and highly effective, despite the limited number of colors in play. There are numerous ways to use duotone color schemes on your blog, so be sure to explore different schemes before settling on a final choice.


Various color schemes and their relationships, as depicted on color wheels. Image via Shopify.

Blog Design Trend #4: Strong Typography & Sophisticated Font Pairing

The web may be much more visual than it used to be, but for most sites, text is still the name of the game, making typography one of the most important design elements of any site.

Alongside bold hero images and duotone color schemes, we’ve seen continued use and increasing popularity of strong, sans serif typefaces on many blogs and sites. This example from technology training provider General Assembly illustrates how strong type can enhance even the simplest of designs, a theme that’s consistent across the entirety of General Assembly’s branding:


General Assembly’s site design uses distinctive bold typography

Font Pairing

One of the most effective techniques in typography is combining two very different yet complementary typefaces in a single design, a trend that has continued throughout this year and seems poised for at least a little more time in the sun.


Font pairings via Mimpy and Co.

Like many aspects of graphic and web design, font pairing appears a great deal simpler than it actually is. And, like many elements of the best designs, successful font pairings appear effortless precisely because of the skill and thought that went into them. Canva’s Design School offers a helpful guide to font pairing, with explanations of why each combo works.

Blog Design Trend #5: Clean, Simple Layouts

It’s easy to focus on elements such as bold color schemes or striking typefaces, but one of the most prevalent trends across the blogosphere this year has been the continued popularity of clean, simple layouts.


Sidebars crammed with buttons, badges, ads, and clutter are long gone from many blogs. There are a couple of reasons for this.

Firstly, as user experience (UX) design methodologies have filtered down into other areas of design, emphasis has shifted away from stuffing as much crap as possible into your blog sidebar (blogrolls, anyone?), and toward cleaner, more streamlined blog layouts. These principles have also been applied to other aspects of web design, such as site structure and navigation, which also help SEO and discoverability as well as reduce technical overhead.


How mobile usage has influenced blog design

Secondly, the shift toward mobile has necessitated new blog design approaches that favor speed and performance – both of which can be negatively impacted by extraneous features such as densely packed sidebars – and designs that look and feel great, even on smaller screens.

Blog Design Trends: B2B vs. B2C Websites

It’s worth noting at this point that many of the tips and techniques we’ve covered so far are applicable to most websites. That said, there are distinct considerations that B2B site owners have to reckon with that B2C sites might not.

The first is color. Ever wonder why so many corporate websites use primarily blue color schemes? It’s because the color blue signifies strength and trust – both qualities that many B2B publishers are keen to cultivate. In the figure below, you’ll notice that many large technology companies utilize a blue color scheme in their logos to convey this message:

Technology company logos that use blue color schemes

B2B websites also have to factor tangible business objectives into their blog design decisions.

For example, many media and news organizations use sidebar content to promote other, related content – think personalized content recommendations based on topic or blog tags. This is a common design element on many sites; it encourages readers to remain on the site to browse other content, reducing bounce rate.

Compare this to WordStream’s sidebar. In the figure below – a screenshot of the main blog page on the WordStream site – you’ll notice that our most prominent call to action is more direct; it encourages readers to grade their AdWords account with the AdWords Performance Grader.

WordStream’s blog sidebar, with its AdWords Performance Grader call to action

Scroll further down and you’ll see another sidebar element promoting WordStream’s free learning resource, PPC University. Both of these actions are directly tied to business objectives that drive leads and sales and also feed into various nurture pathways. It’s also worth noting that these sidebar elements do not display for every user – below a certain resolution threshold, the sidebar ads disappear in favor of a one- or two-column layout for the content.

The reality is that B2B blogs and sites are often subject to demands that B2C publishers may not be, as in the examples above. However, that doesn’t mean that corporate B2B blogs can’t adopt some of the techniques outlined in this post – you just might have to work a little harder to convince your executive team about certain design choices.

from Internet Marketing Blog by WordStream http://ift.tt/2r6WWHa





Monday, May 29, 2017

Evidence of the Surprising State of JavaScript Indexing

Posted by willcritchlow

Back when I started in this industry, it was standard advice to tell our clients that the search engines couldn’t execute JavaScript (JS), and anything that relied on JS would be effectively invisible and never appear in the index. Over the years, that has changed gradually, from early work-arounds (such as the horrible escaped fragment approach my colleague Rob wrote about back in 2010) to the actual execution of JS in the indexing pipeline that we see today, at least at Google.

In this article, I want to explore some things we’ve seen about JS indexing behavior in the wild and in controlled tests and share some tentative conclusions I’ve drawn about how it must be working.

A brief introduction to JS indexing

At its most basic, the idea behind JavaScript-enabled indexing is to get closer to the search engine seeing the page as the user sees it. Most users browse with JavaScript enabled, and many sites either fail without it or are severely limited. While traditional indexing considers just the raw HTML source received from the server, users typically see a page rendered based on the DOM (Document Object Model) which can be modified by JavaScript running in their web browser. JS-enabled indexing considers all content in the rendered DOM, not just that which appears in the raw HTML.

There are some complexities even in this basic definition (answers in brackets as I understand them):

  • What about JavaScript that requests additional content from the server? (This will generally be included, subject to timeout limits)
  • What about JavaScript that executes some time after the page loads? (This will generally only be indexed up to some time limit, possibly in the region of 5 seconds)
  • What about JavaScript that executes on some user interaction such as scrolling or clicking? (This will generally not be included)
  • What about JavaScript in external files rather than in-line? (This will generally be included, as long as those external files are not blocked from the robot — though see the caveat in experiments below)

For more on the technical details, I recommend my ex-colleague Justin’s writing on the subject.

A high-level overview of my view of JavaScript best practices

Despite the incredible work-arounds of the past (which always seemed like more effort than graceful degradation to me), the “right” answer has existed since at least 2012, with the introduction of pushState. Rob wrote about this one, too. Back then, however, it was pretty clunky and manual: it required a concerted effort to ensure that the URL was updated in the user’s browser for each view that should be considered a “page,” that the server could return full HTML for those pages in response to new requests for each URL, and that the back button was handled correctly by your JavaScript.
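For reference, the core of that pattern is only a few lines of browser JavaScript; loadView below is a stand-in for whatever client-side rendering your app does:

```javascript
// Minimal sketch of the pushState pattern: give every "view" a real URL
// without a full page load, and handle the back button via popstate.
function loadView(path) {
  // Stand-in for your app's client-side rendering.
  document.getElementById('app').textContent = 'Rendering ' + path;
}

function navigate(path) {
  history.pushState({ path }, '', path); // update the address bar
  loadView(path);                        // re-render without a reload
}

// Back/forward buttons fire popstate; restore the matching view.
window.addEventListener('popstate', (event) => {
  loadView(event.state ? event.state.path : location.pathname);
});
```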

Along the way, in my opinion, too many sites got distracted by a separate prerendering step. This is an approach that does the equivalent of running a headless browser to generate static HTML pages that include any changes made by JavaScript on page load, then serving those snapshots instead of the JS-reliant page in response to requests from bots. It typically treats bots differently, in a way that Google tolerates, as long as the snapshots do represent the user experience. In my opinion, this approach is a poor compromise that’s too susceptible to silent failures and falling out of date. We’ve seen a bunch of sites suffer traffic drops due to serving Googlebot broken experiences that were not immediately detected because no regular users saw the prerendered pages.

These days, if you need or want JS-enhanced functionality, more of the top frameworks have the ability to work the way Rob described in 2012, which is now called isomorphic (roughly meaning “the same”).

Isomorphic JavaScript serves HTML that corresponds to the rendered DOM for each URL, and updates the URL for each “view” that should exist as a separate page as the content is updated via JS. With this implementation, there is actually no need to render the page to index basic content, as it’s served in response to any fresh request.
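Here's a rough sketch of that idea on a Node/Express server; renderToHTML is a stand-in for your framework's server-side renderer (React's renderToString, Angular Universal's equivalent, and so on):

```javascript
const express = require('express');
const app = express();

// Stand-in renderer: a real isomorphic app runs the same component code
// on the server as in the browser.
const renderToHTML = (url) => `<h1>Content for ${url}</h1>`;

app.get('*', (req, res) => {
  const html = renderToHTML(req.url); // full HTML for any fresh request
  res.send(`<!doctype html>
    <div id="app">${html}</div>
    <script src="/bundle.js"></script>`); // client JS takes over from here
});

app.listen(3000);
```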

I was fascinated by this piece of research published recently — you should go and read the whole study. In particular, you should watch the video recommended in the post, in which the speaker — who is an Angular developer and evangelist — emphasizes the need for an isomorphic approach.

Resources for auditing JavaScript

If you work in SEO, you will increasingly find yourself called upon to figure out whether a particular implementation is correct (hopefully on a staging/development server before it’s deployed live, but who are we kidding? You’ll be doing this live, too).

To do that, here are some resources I’ve found useful:

Some surprising/interesting results

There are likely to be timeouts on JavaScript execution

I already linked above to the ScreamingFrog post that mentions experiments they have done to measure the timeout Google uses to determine when to stop executing JavaScript (they found a limit of around 5 seconds).

It may be more complicated than that, however. This segment of a thread is interesting. It’s from a Hacker News user who goes by the username KMag and who claims to have worked at Google on the JS execution part of the indexing pipeline from 2006–2010. It’s in relation to another user speculating that Google would not care about content loaded “async” (i.e. asynchronously — in other words, loaded as part of new HTTP requests that are triggered in the background while assets continue to download):

“Actually, we did care about this content. I’m not at liberty to explain the details, but we did execute setTimeouts up to some time limit.

If they’re smart, they actually make the exact timeout a function of a HMAC of the loaded source, to make it very difficult to experiment around, find the exact limits, and fool the indexing system. Back in 2010, it was still a fixed time limit.”

What that means is that although it was initially a fixed timeout, he’s speculating (or possibly sharing without directly doing so) that timeouts are programmatically determined (presumably based on page importance and JavaScript reliance) and that they may be tied to the exact source code (the reference to “HMAC” is to do with a technical mechanism for spotting if the page has changed).

It matters how your JS is executed

I referenced this recent study earlier. In it, the author found:

Inline vs. External vs. Bundled JavaScript makes a huge difference for Googlebot

The charts at the end show the extent to which popular JavaScript frameworks perform differently depending on how they’re called, with a range of performance from passing every test to failing almost every test. For example, here’s the chart for Angular:

Angular’s test results by implementation method

It’s definitely worth reading the whole thing and reviewing the performance of the different frameworks. There’s more evidence of Google saving computing resources in some areas, as well as surprising results between different frameworks.

CRO tests are getting indexed

When we first started seeing JavaScript-based split-testing platforms designed for testing changes aimed at improving conversion rate (CRO = conversion rate optimization), their inline changes to individual pages were invisible to the search engines. As Google in particular has moved up the JavaScript competency ladder through executing simple inline JS to more complex JS in external files, we are now seeing some CRO-platform-created changes being indexed. A simplified version of what’s happening is:

  • For users:
    • CRO platforms typically take a visitor to a page, check for the existence of a cookie, and if there isn’t one, randomly assign the visitor to group A or group B
    • Based on either the cookie value or the new assignment, the user is either served the page unchanged, or sees a version that is modified in their browser by JavaScript loaded from the CRO platform’s CDN (content delivery network)
    • A cookie is then set to make sure that the user sees the same version if they revisit that page later
  • For Googlebot:
    • The reliance on external JavaScript used to prevent both the bucketing and the inline changes from being indexed
    • With external JavaScript now being loaded, and with many of these inline changes being made using standard libraries (such as jQuery), Google is able to index the variant, and hence we see CRO experiments sometimes being indexed

I might have expected the platforms to block their JS with robots.txt, but at least the main platforms I’ve looked at don’t do that. With Google being sympathetic towards testing, however, this shouldn’t be a major problem — just something to be aware of as you build out your user-facing CRO tests. All the more reason for your UX and SEO teams to work closely together and communicate well.

Split tests show SEO improvements from removing a reliance on JS

Although we would like to do a lot more to test the actual real-world impact of relying on JavaScript, we do have some early results. At the end of last week I published a post outlining the uplift we saw from removing a site’s reliance on JS to display content and links on category pages.


A simple test that removed the need for JavaScript on 50% of pages showed a >6% uplift in organic traffic — worth thousands of extra sessions a month. While we haven’t proven that JavaScript is always bad, nor understood the exact mechanism at work here, we have opened up a new avenue for exploration, and at least shown that it’s not a settled matter. To my mind, it highlights the importance of testing. It’s obviously our belief in the importance of SEO split-testing that led to us investing so much in the development of the ODN platform over the last 18 months or so.

Conclusion: How JavaScript indexing might work from a systems perspective

Based on all of the information we can piece together from the external behavior of the search results, public comments from Googlers, tests and experiments, and first principles, here’s how I think JavaScript indexing is working at Google at the moment. I think there is a separate queue for JS-enabled rendering, because the computational cost of trying to run JavaScript over the entire web is unnecessary given the lack of a need for it on many, many pages. In detail (and sketched in code after this list), I think:

  • Googlebot crawls and caches HTML and core resources regularly
  • Heuristics (and probably machine learning) are used to prioritize JavaScript rendering for each page:
    • Some pages are indexed with no JS execution. There are many pages that can probably be easily identified as not needing rendering, and others which are such a low priority that it isn’t worth the computing resources.
    • Some pages get immediate rendering – or possibly immediate basic/regular indexing, along with high-priority rendering. This would enable the immediate indexation of pages in news results or other QDF results, but also allow pages that rely heavily on JS to get updated indexation when the rendering completes.
    • Many pages are rendered async in a separate process/queue from both crawling and regular indexing, thereby adding the page to the index for new words and phrases found only in the JS-rendered version when rendering completes, in addition to the words and phrases found in the unrendered version indexed initially.
  • The JS rendering also, in addition to adding pages to the index:
    • May make modifications to the link graph
    • May add new URLs to the discovery/crawling queue for Googlebot
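To make that model concrete, here's a purely illustrative sketch of the two-queue idea. Every function is a stand-in for my speculation; this is emphatically not Google's actual code:

```javascript
// Illustrative two-queue model: index raw HTML immediately; render JS
// later, in a separate prioritized queue, only where heuristics say so.
const renderQueue = [];

// Trivial stand-ins so the sketch runs:
const indexContent = (url, content) => console.log('indexed', url);
const needsRendering = (page) => /<script/i.test(page.rawHTML);
const priorityOf = (page) => (page.isNews ? 10 : 1); // e.g. QDF pages jump the queue
const headlessRender = (page) => ({ text: '', links: [] });
const updateLinkGraph = (url, links) => {};

function processCrawledPage(page) {
  indexContent(page.url, page.rawHTML); // immediate indexing, no JS executed
  if (needsRendering(page)) {
    renderQueue.push({ page, priority: priorityOf(page) });
  }
}

function drainRenderQueue() {
  renderQueue.sort((a, b) => b.priority - a.priority);
  for (const { page } of renderQueue.splice(0)) {
    const dom = headlessRender(page);     // bounded JS execution budget (~5s?)
    indexContent(page.url, dom.text);     // add words found only post-render
    updateLinkGraph(page.url, dom.links); // may queue new URLs for crawling
  }
}
```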

The idea of JavaScript rendering as a distinct and separate part of the indexing pipeline is backed up by this quote from KMag, who I mentioned previously for his contributions to this HN thread (direct link) [emphasis mine]:

“I was working on the lightweight high-performance JavaScript interpretation system that sandboxed pretty much just a JS engine and a DOM implementation that we could run on every web page on the index. Most of my work was trying to improve the fidelity of the system. My code analyzed every web page in the index.

Towards the end of my time there, there was someone in Mountain View working on a heavier, higher-fidelity system that sandboxed much more of a browser, and they were trying to improve performance so they could use it on a higher percentage of the index.”

This was the situation in 2010. It seems likely that they have moved a long way towards the headless browser in all cases, but I’m skeptical about whether it would be worth their while to render every page they crawl with JavaScript given the expense of doing so and the fact that a large percentage of pages do not change substantially when you do.

My best guess is that they’re using a combination of trying to figure out the need for JavaScript execution on a given page, coupled with trust/authority metrics to decide whether (and with what priority) to render a page with JS.

Run a test, get publicity

I have a hypothesis that I would love to see someone test: That it’s possible to get a page indexed and ranking for a nonsense word contained in the served HTML, but not initially ranking for a different nonsense word added via JavaScript; then, to see the JS get indexed some period of time later and rank for both nonsense words. If you want to run that test, let me know the results — I’d be happy to publicize them.


from Moz Blog http://ift.tt/2qxWjmk



