DigiBiz Consulting

Category: SEO

One of the hotly debated topics over the past few years has been whether social signals (e.g. social media likes, shares, and so on) contribute to a site’s higher ranking in SEO. A few years ago, staff from Google apparently stated that social signals were not used as a ranking factor for web pages. That position was based on the following:

  • It’s easy for anyone to create fake social media profiles and post links from those pages. The sheer number of social media platforms, and the possibility of many fake accounts on each, makes Google wary of treating such links as valid backlinks.
  • The authority of the social media pages where links are shared isn’t as easily established as it is for regular web pages and websites.

Due to these and some other concerns, the understanding in the SEO community is that social media signals contribute little, if anything, directly to a site’s organic search ranking and SEO.

However, in testing, some web pages with many shares across social media have shown improvements in their organic search rankings. That may be an indirect effect of sharing: links that are shared extensively over social media get a lot of exposure and are much more likely to be picked up by someone and included on their website or blog.

The bottom line is that sharing on social media is encouraged: the more social signals you pick up, the better it is for your brand and inbound traffic, and the greater the chance of earning backlinks if some of the people who see your link decide to link to your web page from their own sites.

— End

Webmasters should be wary of factors that can negatively impact not only the user experience but also a site’s SEO, which in turn can hurt its position and ranking within organic search and SERPs. The following are some of those factors:

Black Hat SEO Techniques to Avoid

  1. Keyword stuffing – Some webmasters stuff their web pages with keywords hoping to improve search engine rankings. Although this may have worked in the past, search engines today are much too intelligent to reward such practices. The rule of thumb is that if it’s not a good read for users, it won’t be good for the search engines or the site’s and/or page’s overall rankings. (A rough keyword-density check is sketched after this list.)
  2. Spamming blogs with comments – Many black hat online marketers still spam the comments sections of blogs hoping to pick up backlinks. This simply won’t work.
  3. Doorway pages – Some webmasters implement doorway pages that attract traffic only to redirect it to other sites. Search engines don’t look at such techniques favorably.
  4. Duplicate content – Although it’s alright on occasion to reference content from other sources, wholesale copying and duplicating of content can land a site in trouble, especially if it’s done persistently. Websites with original content are much more likely to be placed higher in search engine rankings than those with duplicate content.
  5. Overuse of anchor text – Sites that overuse the same anchor text are more likely to be flagged by search engines as unnatural. For example, repeatedly using the anchor text “SEO tips” across a website looks unnatural. Instead, use natural expressions as anchor text to point to relevant web pages.
  6. Use of hidden text – Some webmasters hide text and keyword phrases on web pages thinking they won’t be caught, using either tiny fonts or fonts colored to blend into the background. What they don’t realize is that web crawlers can easily detect such black hat techniques.
  7. Cloaking – This refers to presenting one set of content to search engine crawlers while sending users who click the link to different, often spammy, content. Although search engines have built in the intelligence needed to detect such techniques, cloaking is unfortunately still used extensively in the industry.
  8. Links coming from link farms – Some webmasters post links in link farms hoping that backlinks from such sites will improve their search rankings. On the contrary, search engine crawlers have become intelligent enough to identify such bad link neighborhoods, and links from them can hurt the trustworthiness of your site.
  9. Content or article spinning – This refers to taking an existing article and rephrasing it to make it look like original content. Again, search engine algorithms today incorporate a great deal of natural language understanding and can recognize spun content.
  10. Clickbaiting – In this technique, webmasters create a headline that has little to do with the actual content, simply to attract clicks on the link. Even though it may work in the short run, search engines will quickly detect it when they analyze the page’s high bounce rate and low session duration.
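To make the keyword-stuffing point in item 1 concrete, here is a minimal sketch in Python (the URL and phrase are placeholders, and the tag stripping is deliberately crude) that estimates how much of a page’s text a single phrase takes up. An unusually high density is exactly the kind of unnatural pattern modern search engines flag.

```python
import re
from urllib.request import urlopen

def keyword_density(url, phrase):
    """Rough estimate of how much of a page's visible text one phrase takes up."""
    html = urlopen(url, timeout=30).read().decode("utf-8", errors="ignore")
    text = re.sub(r"<[^>]+>", " ", html)          # crude tag stripping, fine for a sketch
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    hits = sum(
        1
        for i in range(len(words) - len(target) + 1)
        if words[i:i + len(target)] == target
    )
    density = hits * len(target) / max(len(words), 1) * 100
    return hits, len(words), density

# Hypothetical example: a page where one phrase dominates the copy is a red flag.
hits, total, density = keyword_density("https://example.com/", "seo tips")
print(f"'seo tips' occurs {hits} times in {total} words ({density:.1f}% of the text)")
```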

Conclusion

It’s clear that employing these black hat SEO techniques involves a lot of work and effort. Webmasters and marketers are better off investing the same time and effort in creating useful content and building a good online presence that adds to the overall user experience. These techniques, even if they succeed in the short term, eventually get caught, with undesirable consequences such as rankings dropping overnight, websites being banned or delisted from search engines, and more.

— End


What is the Google Search Index?

When a new site and its pages are created, its information must be included in the Google database so that when users search on Google, those pages can be discovered and displayed in the SERPs (Search Engine Results Pages). This Google database of sites and the information on their pages is referred to as the Google Search Index.


How is the Google index updated?

Google Search has a crawler (referred to as Googlebot) that continually crawls the web looking for updates to the sites it can reach: new sites and pages that have been added, sites and pages that have been deleted, and changes to existing pages. Googlebot collects this information and updates the Google Search Index with it.

Each search query results in Google finding potentially millions of relevant pages. These pages are ranked by sophisticated search algorithms and displayed to the end user in the organic search results. However, before the search algorithms can do their magic, Google’s web crawlers have to find and organize those web pages in the search index. With the ever-increasing amount of information on the web, the Google Search Index keeps growing rapidly. Besides all the web content that is available, the index has come to include pages from millions of already published books, swelling it to trillions of pages, and it is still growing. Organizing this information, sorting through it, and ranking pages is no small feat. This is where Google’s crawlers and search algorithms do their job.

Googlebot finds all this information by following website links: it moves from one link to another until it reaches all the websites and their pages. To crawl and index each page, Googlebot must be able to read the information on it properly. To facilitate this, Google publishes best practices and guidelines that make it easier for Googlebot to index web pages. Although there are some technicalities to these practices, for the most part, if a website is structured well for its end users, Googlebot has no problem reading the site’s information and adding it to the index.
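As a rough illustration of this link-following behavior (a simplified sketch, not Googlebot’s actual implementation; the start URL is a placeholder), the snippet below discovers pages on a site by repeatedly extracting links from each fetched page and queueing the ones it has not yet seen.

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href values of anchor tags found on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def discover(start_url, max_pages=25):
    """Follow links from page to page, the way a crawler builds its list of URLs to index."""
    seen, queue = {start_url}, deque([start_url])
    site = urlparse(start_url).netloc
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", errors="ignore")
        except OSError:
            continue                      # unreachable pages are simply skipped
        extractor = LinkExtractor()
        extractor.feed(html)
        for href in extractor.links:
            absolute = urljoin(url, href)
            # Stay on the same site and skip anything already discovered.
            if urlparse(absolute).netloc == site and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return seen

print(discover("https://example.com/"))   # placeholder start URL
```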

How can Google (and Googlebot) rapidly index and rank a website and its pages?

Although there is no exact science that guarantees indexing and a search ranking within a certain amount of time, webmasters can take a number of steps to try to get on the search map quickly. Here are some of those ways.

Get a Google Search Console Account

Google Search Console is a service provided by Google that helps webmasters monitor the performance of their websites in the Google search engine. Bing provides a similar service. Although signing up for Google Search Console is not a requirement for being included in Google’s organic search results, it’s highly recommended because it allows the webmaster to monitor the site’s presence in Google’s search index. Webmasters can view performance data for the search terms that drive traffic to a website, any crawl errors that Googlebot may be experiencing due to technical issues, HTML improvements suggested by Google, and so on. Here are some of the primary functions available to a webmaster to help manage a website’s inclusion and positioning within Google Search (a sketch of pulling some of this data programmatically follows the list):

  • Overall status of a website within the Google Index. This includes information related to crawl errors, search analytics, sitemaps, etc.
  • HTML improvements recommended by Google
  • Status of Accelerated Mobile Pages if the website uses the feature
  • Links to a website (internal and external)
  • Information related to structured data, which lets webmasters provide specific structured information to Google to ease indexing
  • Any resources (web pages, files, etc.) that are blocked from being included in the Google index
  • URL removal requests – URLs that have been requested to be removed from the Google Index
  • Site crawl errors that may be preventing web pages from being included in Google’s index
  • Crawl stats, including pages crawled per day, kilobytes downloaded per day, and time spent downloading a page (in milliseconds)
  • Security issues – This includes information on whether a website has been hacked or malware has been installed on it. Such issues can prevent the website’s actual pages from showing up in the index or cause them to redirect to other sites (stealing traffic).
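As a minimal sketch of pulling search analytics data programmatically (assuming the google-api-python-client package, OAuth credentials obtained elsewhere, the Search Console API enabled, and a verified property; the property URL and date range are placeholders):

```python
# Sketch only: assumes `pip install google-api-python-client`, the Search Console API
# enabled in a Google Cloud project, and OAuth credentials for a verified property.
from googleapiclient.discovery import build

def top_queries(credentials, site_url="https://example.com/"):   # placeholder property
    """Print the top search queries driving traffic to a verified property."""
    service = build("searchconsole", "v1", credentials=credentials)
    body = {
        "startDate": "2024-01-01",        # placeholder reporting window
        "endDate": "2024-01-31",
        "dimensions": ["query"],
        "rowLimit": 25,
    }
    response = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
    for row in response.get("rows", []):
        print(row["keys"][0], row["clicks"], row["impressions"], row["position"])
```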


Internal Linking

Internal links not only allow Googlebot to reach your internal pages quickly, they also help it assess the importance of those pages relative to others. Obviously, if a web page is referenced (and hence internally linked) more often, that signals it is highly valued by the website owner.

Ensure a good website structure

Similar to the earlier point, a well-structured website makes Googlebot’s job much easier: it can determine the structure of the site and get to its pages in a short span of time. The easier you make the site to navigate for users (and Googlebot), the sooner it can index those pages and make them available for organic search.

Include buttons for social shares

Another way is to include social sharing buttons on your website. These allow your readers to share the content they like on social media and other pages, making it more visible to Googlebot.

Get external links

As mentioned earlier, Googlebot finds pages through links, whether published internally or on external websites. If a popular external website that Googlebot visits frequently carries a link to a new website, Googlebot will add that site to its index rather quickly. Moreover, the inclusion of a link on an established, popular website is an indication of the link’s quality, and Google Search uses this to rank the linked page higher for the relevant keywords.

Website promotion

As part of their duties, webmasters should constantly promote a website to various directories and other places on the web. Again, the more a website and its pages are linked from external sources, the faster they will make it into the Google Index.

Publishing of the RSS feed

Publishing an RSS feed for a blog enables interested parties to subscribe to it, and every place the feed’s items are picked up adds another external reference for Googlebot to follow.
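As a minimal sketch of what such a feed contains (the post title, links, date, and output filename below are placeholders), this snippet writes a bare-bones RSS 2.0 file using only the Python standard library:

```python
from datetime import datetime, timezone
from email.utils import format_datetime
from xml.sax.saxutils import escape

# Placeholder posts; in practice these would come from the blog's CMS or database.
posts = [
    {
        "title": "What is the Google Search Index?",
        "link": "https://example.com/blog/google-search-index",
        "date": datetime(2024, 1, 15, tzinfo=timezone.utc),
    },
]

items = "".join(
    "<item><title>{t}</title><link>{l}</link><pubDate>{d}</pubDate></item>".format(
        t=escape(p["title"]), l=escape(p["link"]), d=format_datetime(p["date"])
    )
    for p in posts
)

feed = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<rss version="2.0"><channel>'
    "<title>SEO Corner</title>"
    "<link>https://example.com/blog</link>"
    "<description>SEO articles</description>"
    + items
    + "</channel></rss>"
)

with open("feed.xml", "w", encoding="utf-8") as f:   # placeholder output path
    f.write(feed)
```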

Avoid Black Hat SEO

In Google Search’s initial years, many tried to guess how the search engine worked and devised methods to fool it. Such black hat SEO techniques are not only strongly discouraged by Google, but Google also applies penalties to sites known to employ them. Webmasters, therefore, should not only avoid black hat SEO techniques but should also familiarize themselves with the practices that Google may construe as unacceptable.

Periodically check Google Search Console

Webmasters should periodically check whether the site pages they care about have been indexed by Google. This can be done by searching on Google directly or by using Google Search Console.

Stay tuned on Google Search Updates

With rapid advancements in the area of search, it’s advisable to stay updated on the best practices published by Google, Bing, Yahoo, or any other search engine of interest. The important thing to note is that even if your site ranks high today, it may not be at the same level the next day. Google indexing rules change constantly, and your webmaster must stay up to date on any recent announcements or changes by Google.


— End


Over the past few years, page speed and load times have become a major factor in SEO rankings. And for good reason: slow page speed and load times seriously hamper a good user experience and can thus be an obstacle in a customer’s online journey. Google, therefore, uses this as one of the factors in determining site ranking.

Many studies on this topic clearly show that a one-second increase in page load times can have a major impact for some online retailers. As reported on Fastcompany.com, Amazon estimates that a one-second increase in page load times could cost it a whopping $1.6 billion in sales each year. This loss is attributed to user frustration, which in turn leads to cart and site abandonment.
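As a quick way to put a number on server responsiveness (a rough sketch; the URL is a placeholder, and this measures only the raw HTML download, not full page rendering in a browser):

```python
import time
from urllib.request import urlopen

def average_fetch_time(url, runs=3):
    """Average time to download a page's HTML; a crude proxy for server responsiveness."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        urlopen(url, timeout=30).read()
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)

print(f"Average download time: {average_fetch_time('https://example.com/'):.2f} s")
```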

— End


SERP stands for Search Engine Results Page, and SEO is the art and science of improving a site’s positioning within SERPs. A SERP is the page of results a user sees after typing a query into a search engine like Google or Bing.


A typical SERP has two parts. The top part (portions of which sometimes appear on the side as well) holds the paid results, while the bottom part holds the organic search results. SEO (Search Engine Optimization) is about improving the organic part: having one’s website rank higher in SERPs so that it gets more online traffic (organic, or free, traffic).

Although SEO is about securing a site’s higher placement in the organic results within the SERP, some of its guidelines help with placement in the paid results as well. With advertisers competing to appear at the top of the paid results at the lowest possible cost, Google evaluates factors familiar from SEO (e.g. page quality, page speed, and so on) to assess the quality of the pages behind paid results and uses that information when ranking a site within the paid results.

— End


Formerly known as Google Insights, Google Trends is a tool that allows marketers to compare search terms side by side based on their popularity over time. The data is gathered from Google’s search engine and shows a term’s share of Google web searches relative to all searches during that period. The results can be filtered by date, geographic region, or search type. For example, for the term “happy new year”, most searches occur in the few days before the new year, and interest tapers off through late January. It is a very useful tool for anyone who wants to gauge the potential future viability, or the historic popularity, of a search term or product. Search terms can also be compared relative to each other.
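For programmatic access, many marketers use the unofficial pytrends package. The sketch below assumes that package (it is not a Google product, and its interface changes from time to time) and compares two terms’ relative search interest over the past year.

```python
# Uses the unofficial pytrends package (`pip install pytrends`); it is not a Google API,
# and its interface may change, so treat this purely as an illustrative sketch.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US")
pytrends.build_payload(kw_list=["happy new year", "black friday"], timeframe="today 12-m")
interest = pytrends.interest_over_time()   # pandas DataFrame of relative interest (0-100)
print(interest.tail())
```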

— End


When Googlebot visits your website, it does its best to crawl its various parts. However, depending on the technical setup of your site, the crawl may be hindered, and the bot may not reach all the parts of the website that you want indexed. Many factors can increase Googlebot’s efficiency while it’s on your site, so make sure you research the latest guidelines from Google on improving crawl efficiency.
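One concrete, low-effort check is to make sure your robots.txt file is not accidentally blocking Googlebot from pages you want indexed. A minimal sketch using the Python standard library (the URLs are placeholders):

```python
from urllib.robotparser import RobotFileParser

robots = RobotFileParser()
robots.set_url("https://example.com/robots.txt")   # placeholder site
robots.read()

# Check whether Googlebot is allowed to fetch pages you care about.
for page in ("https://example.com/", "https://example.com/blog/seo-tips"):
    allowed = robots.can_fetch("Googlebot", page)
    print(f"{page}: {'crawlable' if allowed else 'blocked by robots.txt'}")
```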

— End


Images can dramatically enhance a user’s experience on a site, so you should select the right images for yours. While it’s important to use images, make sure not to go overboard, especially if they would have a negative influence on the user experience. When using images, populate the ‘alt’ tags with information describing the images. That way, if an image doesn’t load properly, the user still gets an idea of what it shows (through the text displayed in its place). Also, make sure you use good filenames, titles, and captions with your images. Not only can these help improve your search engine rankings, but they also contribute to a good user experience. (A small alt-text audit sketch follows the checklist below.)

Image rules for SEO – Best Practices

So, in summary, follow these rules for images:

  1. Pick an image that contributes to a good user experience
  2. Choose a friendly image name (preferably one that includes the relevant keywords)
  3. Include captions next to your images
  4. Use the ‘Alt-Text’ field to specify what’s in the image.
  5. Be careful about the image size. It should not impact the load times and page speed. Remember, long page load times can impact a page’s search rankings.
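To check an existing page against rule 4, here is a small audit sketch using the Python standard library’s HTML parser (the page URL is a placeholder); it lists every image that is missing alt text.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class AltAudit(HTMLParser):
    """Collects images whose alt attribute is missing or empty."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attributes = dict(attrs)
            if not attributes.get("alt"):
                self.missing.append(attributes.get("src", "(no src)"))

html = urlopen("https://example.com/", timeout=30).read().decode("utf-8", errors="ignore")
audit = AltAudit()
audit.feed(html)
print("Images missing alt text:", audit.missing or "none")
```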

 

— End


Google Search Console, formerly known as Webmaster Tools, can help you understand how Google interacts with your website. Google Search Console won’t improve your site’s credibility by itself, but it can help you understand the underlying issues that affect a site’s SEO rank. It provides indexing status for sites and also points to issues and recommendations that can help you improve your search engine rankings.

— End


It’s a widely held view in SEO that linking out to other sites or pages dilutes the site’s or page’s rank. Furthermore, linking to spammy sites can hurt your site’s overall search engine rankings. This can become an issue on blogs, where people who comment often link back to their own sites: too many links pointing out of your blog can dilute the overall rank of your own site. In those cases, you can add the ‘nofollow’ attribute to those links. This tells Google to disregard the outgoing links while still allowing commenters to include links that may be beneficial to your blog readers.
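As a sketch of applying this to comment markup (the helper below is hypothetical, not a WordPress or Google API; a real site would typically do this in its templating or comment system), the snippet adds rel="nofollow" to anchor tags in a block of comment HTML:

```python
import re

def add_nofollow(comment_html):
    """Add rel="nofollow" to anchor tags that don't already carry a rel attribute.
    A simplified sketch; a real site would handle this in its templating layer."""
    def _patch(match):
        tag = match.group(0)
        if "rel=" in tag:
            return tag                         # leave existing rel attributes alone
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", _patch, comment_html)

comment = '<p>Nice post! Visit <a href="https://example.org/my-site">my site</a>.</p>'
print(add_nofollow(comment))
# <p>Nice post! Visit <a href="https://example.org/my-site" rel="nofollow">my site</a>.</p>
```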

— End


DigiBizConsulting offers the best consulting advice on starting a new digital business and addressing all the areas of running a new digital business.