5 Google Algorithm Updates and How Your Website Can Adapt to Them

Google wants to help its users find what they are looking for. So what does that have to do with us as web developers? We’ll find out as we drill into 5 Google algorithm updates and what they mean.

We can’t deny that most of us look to search engines like Google for answers, and as web developers, many of you will already be familiar with some of Google’s updates and how they affect websites. Below, we tackle 5 of those algorithms and how your website should adapt to them.

Page Layout Update

As part of Google’s efforts to go after sites that are “top heavy” with ads, Matt Cutts, the search giant’s head of webspam, announced on Feb. 6, 2014 via Twitter the latest confirmed update to the Page Layout algorithm. The algorithm was released on Jan. 19, 2012, with the first update following on Oct. 9, 2012; the two impacted less than 1% and 0.7% of English searches respectively.

So what does the third update entail for web developers and websites? To understand whether it will impact your site or not, let’s first find out what Google means by this latest Page Layout tweak.

Here’s the original quote from Google:

We’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away.

So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part you see first either doesn’t have much visible content above the fold or dedicates a large fraction of the initial screen real estate to ads, that’s not a very good user experience.

Such sites may not rank as highly going forward.

From what we can glean from Google’s statement, the update is meant to weed out sites with a high ratio of ads to content. More specifically, Google is targeting sites that pile ads into the top part of the page, the first part users see before they scroll down.

According to Cutts in his keynote address, “If you look at the top part of your page and the very first thing you see front and center, top above the fold, is ads right there, then you might want to ask yourself, ‘do I have the best user experience?’”

Google’s latest iteration of the Page Layout algorithm, therefore, is all about improving user experience by catching and devaluing sites that load up on ads above the fold where useful, relevant content should be. It is also Google’s way of rewarding websites that put more content above the fold, thereby improving users’ overall search experience.

Authorship Removed

In June 2011 Google released Authorship markup, which was meant to highlight authors who consistently provided high-quality content. Three years later, however, Google’s John Mueller announced via Google+ that the search engine would no longer be displaying authorship.

The overhaul didn’t come as a surprise, though, considering that Google had been showing signs of abandoning Authorship for a while. The change followed Google dropping authors’ photos from search results in June 2014, while leaving the bylines intact for qualified authors. Two months later, on Aug. 28, 2014, Google announced the complete removal of Authorship markup, effective immediately; by the next day, authorship bylines were nowhere to be found on any SERP.

Authorship markup was removed because it wasn’t as useful as Google thought it would be. Rather than improving user experience, it distracted from the results, and Google found little to no difference in click-through rates. The majority of authors and webmasters never implemented the markup, and most of those who did implemented it incorrectly.

According to Mueller, when tests were done, “removing authorship generally does not seem to reduce traffic to sites. Nor does it increase clicks on ads. We make these kinds of changes to improve our users’ experience.”

Axing Authorship from search results was Google’s way of providing a more unified experience to its users. But while Google is most people’s search engine of choice, implementing authorship markup is still worthwhile: not for Google’s sake, but for other search engines that may take it into account, and because putting the author’s name on the line lends an article credibility and can improve readership.
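The classic implementation tied a page to an author profile with a rel="author" link. Here is a minimal sketch; the profile ID and author name are placeholders, and note that Google originally also required the profile’s “Contributor to” section to link back to the site:

    <!-- In the page <head>: point at the author's profile (placeholder URL) -->
    <link rel="author" href="https://plus.google.com/YOUR_PROFILE_ID"/>

    <!-- Or as a visible byline link inside the article itself -->
    <a rel="author" href="https://plus.google.com/YOUR_PROFILE_ID">Jane Author</a>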

HTTPS/SSL Update

After months of buzz, Google unveiled its plan to give secure websites a slight ranking boost with its HTTPS/SSL Update. The announcement came on Aug. 6, 2014, five months after Matt Cutts said that he’d like to see SSL become part of Google’s overall search ranking algorithm.

Google made Cutts’ wish a reality by implementing the HTTPS/SSL Update, which gives minor priority to websites with secure HTTPS/SSL encryption. If you’re moving from HTTP to HTTPS, you may expect a small ranking boost. The update’s weight is probably just a fraction of what other signals like high-quality content carry, but it has its benefits nonetheless.

Google emphasizes that the update will start out small, impacting less than 1% of global searches. The update is probably still in a beta phase, and Google may strengthen it if the results stay positive. It is also the search engine’s way of encouraging webmasters to switch to more secure encryption and keep everyone safe on the web.

Before moving your website to HTTPS, however, thinking it through carefully is imperative, since a botched migration can put your website traffic at risk.

Panda

Unveiled in 2011, Google’s Panda Update was a way to prevent websites with poor-quality content, such as scraper sites, content farms, and pages with shallow content, from climbing their way to the top of the search results.

The search filter is updated continually to ensure that sites that slipped past earlier iterations can’t escape for long. Some still manage to make the right changes at the right time to dodge an update, but Google appears to be working double time to close those gaps.

Back on May 19, 2014, Google rolled out a major Panda update that impacted about 7.5% of English searches. Dubbed Panda 4.0, the tweak was believed to include both changes to the algorithm itself and a data refresh, which caused mayhem for many low-quality websites.

The updates do not stop with Panda 4.0. Just four months after that major tweak, Google followed up with another significant Panda update. Panda 4.1, the 27th update to the algorithm, rolled out on Sep. 23, 2014. It involved an algorithmic component believed to have impacted about 3 to 5 percent of searches. Like its predecessor, Panda 4.1 rewards websites with in-depth, well-researched content that is also lengthy and easy to read.

This latest update is believed to be more precise, giving smaller websites with high-quality content a better chance to rank in Google searches, while websites with poor content are kept from ranking at the top of the results.

Read: How to Identify If Your Site Is Penalized by Google

Penguin

Launched in April 2012, the Google Penguin Update was implemented to catch spammy websites: sites that buy (DoFollow) links or pull links from networks to boost their Google rankings. The update also covers link farms and FFA (free-for-all) links, as well as websites that over-optimize their anchor text.

Over the years, Google has released five versions of the Penguin algorithm. Penguin 1.0 launched on Apr. 24, 2012 and was updated twice that same year, on May 25 and Oct. 5. Penguin 2.0 rolled out on May 22, 2013, and another update followed on Oct. 4, 2013. That last release, named Penguin 2.1, was believed to be primarily a data refresh rather than a major tweak; some webmasters, however, cried foul after being hit really hard.

Ten months down the road after that fifth release, webmasters are complaining: there is still no new Penguin update, which has left many websites practically toast despite link cleanup. The lengthy wait for the next Penguin update may come down to the fact that, according to Mueller, Penguin requires a complete rerun of the algorithm.

When Penguin 3.0 finally rolls out, expect the same jarring and jolting that the fifth release caused. Webmasters are urged to prepare themselves by reviewing the quality of their website, the velocity of their link building, their anchor text, and their link sources.

How Can Your Website Adapt to the Changes?

Driven by its mission to weed out websites with low-quality content and offer searchers the best possible results, Google will continue to roll out updates to many of its algorithms. To keep up with the updates and adapt to the changes effectively, consider the points below:

Prevent Index Bloat

One way to prevent index bloat is to restrict crawler access to dynamically generated pages through robots.txt or a noindex tag. If your website has a search function, keep its internal search results pages out of the index as well, as sketched below.
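As a minimal sketch, a robots meta tag in the head of a dynamically generated page (an internal search results page, say) keeps that page out of the index while still letting crawlers follow its links; the robots.txt equivalent would be a Disallow rule for the hypothetical /search/ path:

    <!-- On a dynamically generated page you don't want indexed,
         e.g. internal search results (the path is hypothetical) -->
    <meta name="robots" content="noindex, follow">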

Implement the canonical element.

E-commerce websites with sorting or filtering functions that lead to dynamically generated pages should implement the canonical element.

Canonical tags and 301 redirects solve a similar problem, so you can tailor the choice to your needs. If a duplicate page still has to be shown to visitors, for instance a copy of a listing whose items are merely sorted differently from the original, use a canonical tag pointing back at the original. If visitors never need to see the duplicate, use a 301 redirect instead.

Google skips indexing a page that returns a 301 and ends up indexing the destination page. With a canonical tag, likewise, only the original page gets indexed.
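Here is a minimal sketch of the canonical element, assuming a hypothetical store at www.example.com where /shoes?sort=price shows the same items as /shoes in a different order:

    <!-- In the <head> of the sorted variant (/shoes?sort=price),
         point search engines back at the original listing -->
    <link rel="canonical" href="https://www.example.com/shoes/">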

Make the website more secure by implementing strong HTTPS encryption.

Though the HTTPS/SSL Update is probably still in beta testing, reports show that Google has seen positive results. If that keeps up, more updates are going to zero in on this aspect of websites. So be sure to make your website more secure with strong HTTPS encryption.
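The server-side half of a migration is a 301 redirect from every HTTP URL to its HTTPS twin, done in your server configuration (not shown here). On the page itself, a sketch of what should change, assuming a hypothetical www.example.com: the canonical URL and internal resources should all use the https:// scheme so you don’t send mixed signals to search engines or trigger mixed-content warnings in browsers:

    <!-- Canonical URL and assets on the HTTPS site (placeholder URLs) -->
    <link rel="canonical" href="https://www.example.com/page/">
    <script src="https://www.example.com/js/app.js"></script>
    <img src="https://www.example.com/images/hero.png" alt="Hero image">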

Prioritize Content

The fact that “content is king” cannot be reiterated enough. To adapt easily to Google’s recent and upcoming updates, design a website that puts high-quality content front and center, particularly above the fold. That means making your header smaller and keeping just enough room for your logo, office address, contact number, and navigation links.

More importantly, steer clear of filling the above-the-fold portion of your pages with too many ads. That does not mean removing every ad slot above the fold, but the fewer ad slots you have, the safer your website is under Google’s latest update. Google is not very clear about exactly what it penalizes; plenty of websites do post ads above the fold, and that’s fine unless the ads appear in excessive amounts.

This raises the question: how many ads are too many? There is no official number, so use your judgment. Ads above the fold that don’t interrupt or distract searchers from the content are a good place to start.

Make it easier for visitors to share your content.

To attract visitors and adapt to Google’s changes, make sure your content can be shared easily through social media platforms. With netizens sharing content via social networking websites, you are indirectly attracting links: a reader who likes a piece enough may decide to link to it. Imagine having 10,000 followers with even 10% of them linking back to you; that could do wonders for your traffic.

Remember that search engines, and Google in particular, have taken social media signals into account as a ranking factor. In short, the more shareable your site is via social media, the better its ranking potential.
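One concrete way to make pages more shareable is Open Graph markup, which most social networks read when building a share preview. A minimal sketch, with placeholder values throughout:

    <!-- Open Graph tags in the page <head>; every value is a placeholder -->
    <meta property="og:title" content="5 Google Algorithm Updates">
    <meta property="og:description" content="What the updates mean and how your site can adapt.">
    <meta property="og:image" content="https://www.example.com/images/share-card.png">
    <meta property="og:url" content="https://www.example.com/google-algorithm-updates/">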

The Bottom Line

We have tried to boil SEO down many times, and we often fail. First, we tried boiling it down to just link building, but that was insufficient. Then, because of the Penguin algorithm, we tried boiling it down to making a site accessible, fast, and secure, steering away from link building, but that was still not enough to make a website perform better on the SERPs.

What we think is best is to consider all algorithms, and play by their rules if you want to stay on top. Adapt and thrive, or ignore them at your own peril.

Author Bio

Scott Donald is the Head of Strategy at Creative Digital, an agency specializing in SEO and custom web development. Follow him on Google+ and Twitter.
