If you have been following the blogosphere and recent news about SEO, you have undoubtedly heard about the now not-so-recent overhauls of Google’s algorithm. The new search algorithm affected everyone from large financial institutions to small businesses; nobody was immune. Some were hit especially hard: blacklisted after falling victim to a bot, without being aware of any wrongdoing. This is largely because Google Analytics does not report bot activity and is not concerned with your website’s security.
What are Bots?
Bots are programs commonly used on the web to quickly index pages and collect data. Broken down to its core element, a bot is software programmed to perform legitimate or malicious automated tasks more quickly and efficiently than humans can. Studies suggest that roughly three out of five visitors to your site are bots, and that bots now dominate web traffic.
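To make the definition concrete, here is a minimal sketch (in Python, using only the standard library) of the core of a benign indexing bot: it parses a page and extracts the links a crawler would follow next. The sample HTML is a stand-in for a real HTTP response, and real crawlers add politeness rules (robots.txt, rate limits), queues, and deduplication on top of this.

```python
# Sketch of an indexing bot's core task: extract the links on a page.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag seen in the page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# In a real bot the HTML would come from an HTTP request
# (e.g. urllib.request.urlopen(url).read()); a static sample
# keeps this sketch runnable offline.
sample_html = '<p>See <a href="/about">about</a> and <a href="https://example.com">example</a>.</p>'
parser = LinkExtractor()
parser.feed(sample_html)
print(parser.links)  # the URLs a crawler would visit next
```

A malicious bot has exactly the same skeleton; only the payload differs (posting spam comments, probing for vulnerabilities, copying content).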
Google, Bots and SEO: A Review
Search engine optimization has gone through some serious transformations in the past few years, all in an effort to improve the user’s search experience. Google’s Panda update dealt with poor-quality content and link strategies (backlinks, contextual links, internal links). Penguin and its derivatives targeted spamdexing, comment spam, and sites that had gained rankings through bad SEO techniques such as purchasing low-quality links, comment spamming, link farms, and duplicate content.
Then came Hummingbird, a fitting name for an update that is quick and accurate. Hummingbird combined the Panda and Penguin updates and added further changes that are mainly content related. Together these completely overhauled the search process, making results more conversational and spam-free, and changing SEO forever.
Some claimed SEO was dead, but those who stuck to white-hat SEO techniques that followed Google’s guidelines managed to retain their rankings, and even improve them, with the right combination of quality links, quality content, and other quality-oriented SEO techniques.
Others were not as lucky. Those who, sometimes unknowingly, relied on black-hat SEO practices saw their rankings decline and sometimes disappear entirely from Google’s search engine results pages (SERPs). Among them were victims of hacking who were unknowingly hosting and distributing malware, which Google detected, causing their domains to be blacklisted and suddenly, definitively bringing their SEO traffic to a halt.
How can you tell if you have been hacked?
If you have noticed the red-flag warning “This site may harm your computer” when visiting your site or seeing it in search results, you are a victim. According to a recent study by cloud security experts at Incapsula, bots account for 61.5% of web traffic, outnumbering humans, and a large share of that bot traffic is malicious.
“In the last few years, 130,000 webmasters have taken the necessary steps to get their sites listed again.”
How can you protect yourself?
The key to protection is prevention, and to prevent your site from being hacked you need to know that most of today’s cyber-attackers are not even human. Very few hackers possess the time and knowledge to manually compromise your website’s database or engineer a way to steal your administrative passwords. Instead, over 90% of current cyber-threats are executed with widely available automated tools, which require no specific proficiency and offer the benefit of mass-scale malware injection.
Malicious bots are the agents of such malicious software: automated hackers that today pose a real threat to most SMB sites and even large-scale organizations. This is why we are now seeing wide adoption of bot filtering solutions, which are designed to filter out such “bad bots” to prevent scraping, spamming, malware injection, and hacking, as well as to deter other attacks that could damage your online business.
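As a rough illustration of what the simplest layer of such filtering does, the sketch below combines two naive server-side rules: rejecting requests whose user-agent string matches known attack-tool signatures, and throttling clients that request pages faster than any human would. The signature list and thresholds here are illustrative assumptions; commercial bot-filtering products rely on far more sophisticated signals (behavioral analysis, JavaScript challenges, IP reputation).

```python
# Naive bot filtering: user-agent blocklist plus per-IP rate limiting.
import time
from collections import defaultdict, deque

BAD_AGENT_SUBSTRINGS = ("sqlmap", "masscan", "python-requests")  # illustrative only
WINDOW_SECONDS = 10
MAX_REQUESTS_PER_WINDOW = 20

request_log = defaultdict(deque)  # client IP -> timestamps of recent requests

def allow_request(ip, user_agent, now=None):
    """Return True if the request should be served, False if filtered."""
    now = time.time() if now is None else now
    ua = (user_agent or "").lower()
    if any(bad in ua for bad in BAD_AGENT_SUBSTRINGS):
        return False  # matches a known attack-tool signature
    log = request_log[ip]
    while log and now - log[0] > WINDOW_SECONDS:
        log.popleft()  # forget requests outside the time window
    if len(log) >= MAX_REQUESTS_PER_WINDOW:
        return False  # too many requests too fast to be a human visitor
    log.append(now)
    return True
```

A browser-like request passes (`allow_request("1.2.3.4", "Mozilla/5.0")`), while a flood of requests from one IP, or a request bearing a tool signature, gets filtered. Note that both rules are trivially evaded by determined attackers, which is exactly why dedicated filtering services exist.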
The recent Incapsula study puts this phenomenon in numbers, showing that in 2013, 0.5% of all web traffic came from spam bots, 4.5% from hacker bots, and 20.5% from bot impersonators seeking to gain marketing intelligence or to participate in DDoS or more in-depth application-layer attacks.
Website security experts at Incapsula note that they now see a significant (75%) decline in spam-bot activity, a fact they attribute to Google’s ongoing anti-spam campaign and algorithm changes. On the other hand, they also point to an overall increase in bot traffic: their statistics show that between 2012 and 2013 bot traffic grew by 21% to reach 61.5% of all web traffic, almost half of it malicious.
3 ways bots affect your SEO ranking
- Spamming – Perhaps the ultimate target in Google’s eyes, spam bots are responsible for spamdexing: the practice of generating thousands of poor-quality links through comment abuse. Today, such ‘link farming’ tactics damage the spammer’s target, causing its website to be blacklisted and removed entirely from Google’s search pages.
- Scraping – For many years, scraper bots have been stealing content (plagiarism) and duplicating it on other sites. Unfortunately, this tactic is a silent offender: it can lead to the original site being penalized for duplicate content, regardless of whether it created the content in the first place and bears no responsibility for the duplication. Even if you disable right-clicking on your site, you only deter human scrapers, not bots.
- Hacking or malware injection – A hidden danger. With three out of five visitors to your site being bots, the potential for damage, penalization, or complete loss of traffic is real. Most website owners do not know they are infected until it is too late. Hackers have many methods of gaining access to your site, whether through your server or by posing as a benign commenter. Once inside, they will inject malicious code or a link, sometimes in the header of your HTML, or anywhere for that matter, and it will look no different from your normal HTML. You will not even notice it; the site will operate as normal and seem functional while hiding a threat. If Google detects malware on your pages, it will warn users who click on your pages in the search results before sending them to your site. This will not affect your ranking right away, but the warning posted in your site description is just as bad, since very few users will willingly enter a site carrying such a warning.
This is an example of what injected spam code looks like:
Photo courtesy of WP Smashing Magazine
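Because injected code blends in with your normal HTML, an automated check can help surface it. The sketch below (a rough illustration, not a complete scanner) greps a page for a few common tells of injection: hidden containers, unexpected iframes, obfuscated JavaScript, and links pointing to domains you do not recognize. The patterns and the trusted-domain list are hypothetical assumptions you would tailor to your own site.

```python
# Rough scan for common signs of injected spam in a page's HTML.
import re

TRUSTED_DOMAINS = ("mysite.example",)  # hypothetical: your own domains

SUSPICIOUS_PATTERNS = [
    re.compile(r'display\s*:\s*none', re.I),      # hidden containers often wrap spam links
    re.compile(r'<iframe[^>]+src=', re.I),        # unexpected iframes are a red flag
    re.compile(r'eval\s*\(\s*unescape\(', re.I),  # classic obfuscated-injection idiom
]

def find_suspicious(html):
    """Return a list of findings worth a manual look."""
    findings = [p.pattern for p in SUSPICIOUS_PATTERNS if p.search(html)]
    # Flag absolute links that point outside the trusted domains.
    for href in re.findall(r'href=["\'](https?://[^"\'/]+)', html, re.I):
        if not any(d in href for d in TRUSTED_DOMAINS):
            findings.append("external link: " + href)
    return findings

page = '<div style="display:none"><a href="http://spam.example/pills">cheap pills</a></div>'
print(find_suspicious(page))  # flags the hidden div and the outbound spam link
```

A match is not proof of compromise (many legitimate pages hide elements with CSS), but any hit on a page you did not author that way deserves a closer look, ideally alongside a proper malware scan of your server.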
For more information on how to deal with a penalty, visit the Google Webmaster forums or read this informative article from the SEO experts at Moz.