“Nofollow” Stigma and the “Dofollow” Myth

Since the introduction of the “nofollow” attribute, a stigma has developed that links from “nofollow” blogs are useless. As a result, the myth of the “dofollow” blog has emerged. Unfortunately, there is a community of SEO experts that still believes “nofollow” links have no value when in fact the opposite is true. This article attempts to correct this assumption and discusses the benefits of links acquired from “nofollow” blogs.

Occasionally you will find an authentic and perhaps even ethical “dofollow” blog that isn’t littered with ads. The problem is you are not the first to find it. In fact, several hundred people have already discovered the blog before you and taken full advantage of the fact it is “dofollow”. As a result, most of the posts are already saturated with dozens of unrelated links. Any PageRank the page carries is divided among the total number of outbound links, giving each individual link almost no link juice. Even a blog with a high PR value, when saturated with a copious number of links, will pass little if any value through each link.

Another challenge of “dofollow” blog links is that they are usually not relevant to the theme of your site, and most search engines view these types of blogs as “link pages”, which further reduces their value. If a site has numerous unrelated links it raises a red flag for most search engines. Search engines can easily identify link pages, and a link from such a source will do very little for your site except raise suspicion.

SEO experts know that a healthy link profile is an essential component of optimization. “Dofollow” blogs risk damaging your site’s reputation and tainting your link profile. A better approach is to invest your time posting comments on relevant, industry-specific blogs. PageRank might not be transferred through these links, but they will still help build a healthy link profile. Consider, for example, an SEO expert who relies heavily on directory submission to acquire inbound links: links acquired from a single type of source appear inorganic. Blog commenting increases link diversity, helping optimized sites appear more organic.

There are no shortcuts in the SEO world. SEO experts who search for loopholes such as the “dofollow” myth find themselves wasting valuable time that could have been invested in building an organic link portfolio. Links from “nofollow” blogs are an essential component of an organic link profile, while links from “dofollow” blogs should generally be avoided as they are usually of poor quality and come from dodgy websites.

Questions to Ask When Hiring an SEO Consultant

There is no governing body that accredits SEO specialists, and almost anyone can claim to be an expert. As a result, it is important to do some research and learn what questions you should ask in order to determine whether an SEO consultant knows what they are doing. This article outlines a few important questions to ask when hiring an SEO consultant.

The first question to ask any SEO consultant is whether they have examples of work they have done and what placements they have achieved. Top placement can easily be achieved for non-competitive keywords that don’t actually drive quality traffic to your site, and optimizing for such keywords is a common trick used by some unethical SEO experts. Sometimes site owners become so fixated on being on the first page of Google that they forget to check whether people are actually searching for the keywords. A professional SEO consultant should provide you with a thorough keyword analysis including monthly search volume, competitiveness, and a list of other top companies currently targeting these keywords. First page placement means nothing unless you place for quality keywords.

Another important question to ask when hiring an SEO consultant is whether a monthly maintenance package is required to maintain results. Once a site is placed on the first page of search results for the desired keywords, little maintenance is required to keep it there unless you are in a highly competitive market. If a monthly maintenance package is required, be sure to ask your consultant why they feel it is necessary. Unfortunately, a common practice of some unethical SEO experts is to place a customer’s site on the first page of search results and then later remove some of the acquired links, causing a drop in placement. As a result, the client is tricked into thinking a monthly maintenance package is required. To prevent this from happening, be sure that your consultant can explain why their monthly maintenance package is necessary.

The last question you should ask involves ethical optimization practices. It is important to ask your consultant whether they perform ethical optimization and whether they can define what ethical optimization is. Although there are no legal authorities that govern search engine optimization, it is important that your SEO consultant practices ethical optimization. There are many hidden rules in the industry that, when violated, can result in penalties ranging from a loss in placement to being removed from the index entirely. Some of these practices include black hat techniques such as keyword stuffing, cloaking, invisible text, reciprocal link exchanges (i.e. link farms), and buying or selling links. If your SEO consultant feels they are smart enough to use these techniques and get away with it, find another consultant. Having your site blacklisted isn’t worth the risk.

There are many things to consider when hiring an SEO expert. The best advice is to do some research and talk to a few different consultants before making any commitment. Be sure to ask your consultant whether they have samples of work they can show you, including keyword search volume and competitiveness. Ask whether your site will require a monthly maintenance package and why they feel it is necessary. Lastly, be sure to ask whether your consultant practices ethical optimization. You will find that most professional SEO consultants will expect these questions and take no offence to you asking. If a consultant reacts badly to these questions or can’t provide a respectable answer, it is best to look for another consultant.

How Your Site Appears to a Search Engine

In order to successfully optimize a website it is important to have a basic understanding of how search engines work. Without this basic understanding, any attempt to optimize your site could be counterproductive. This article provides an overview of how search engines work and how they see the pages of your site.

Search engines find websites using special software referred to as spiders (search bots). These spiders crawl the internet, moving from website to website using links as channels that take them from page to page. The more inbound links your site has, the more frequently it is crawled. If any changes have been made to your site, the spider will take notice and collect this information.

It is important to know that spiders primarily read the content of a site and that images are transparent to them. The only information a spider can collect about an image comes from the alt attribute of the img tag and the surrounding text. For example, if you are trying to optimize your site for the keyword “Toronto SEO Company” and have an image of your company and team members, you might include an alt attribute with the keyword embedded, such as alt=”seo company Toronto”, and a title attribute such as title=”Affordable SEO Services in Toronto” in the img tag. In addition, you may also want to include a caption below the image such as “Our SEO Team in Toronto”. This also provides a good opportunity to increase the keyword density of the page, as long as you don’t unnaturally stuff keywords.
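To make the markup concrete, here is a minimal Python sketch that assembles an img tag carrying keyword-bearing alt and title attributes as described above. The file name and attribute values are hypothetical examples, not taken from a real site.

```python
# A minimal sketch: build an <img> tag whose alt and title attributes
# carry the target keyword. The src, alt, and title values below are
# invented examples for illustration.
def build_img_tag(src, alt, title):
    """Return an HTML img tag with the given alt and title attributes."""
    return '<img src="{}" alt="{}" title="{}">'.format(src, alt, title)

tag = build_img_tag(
    "team-photo.jpg",
    "seo company Toronto",                  # keyword-bearing alt text
    "Affordable SEO Services in Toronto",   # descriptive title attribute
)
print(tag)
```

The caption text mentioned above would simply be placed near the image in the page markup, where a spider reads it as surrounding text.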

Spiders crawl the web using hyperlinks, so it is important to know that sites, or pages within a site, that require passwords cannot be traversed by spiders. As a result these pages won’t be indexed or contribute to the overall PageRank of your site. As spiders traverse links they also collect information about those links, so it is important to ensure that appropriate anchor text is used and that it is relevant to the pages linked. If a link includes the anchor text “Toronto Adwords Management” it should point to pages within your site that contain relevant information about Adwords management in Toronto. If a spider traverses a link within your site and discovers that the pages don’t match the anchor text and no common theme is presented, it will become confused, may not finish crawling your site, and may regard it as dodgy.

The easiest way to know how search engines view your site is with an all-text browser. This type of browser only displays text and links and can help you gain an understanding of how search engines see your site so that you can make improvements. A text-only browser ignores images and graphics and only processes the text and links found on a page, similar to a spider. Lynx Viewer is an example of an all-text browser and is available for free at http://www.delorie.com/web/lynxview.html. In order to use it you will have to create an .htm file called delorie.htm and upload it to your root directory. This file is required to confirm that you are the actual owner of the site. Once this is completed, all you have to do is enter your URL and see what your site looks like in a text-only browser.
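As a rough illustration of what a text-only view retains, the following Python sketch uses the standard library’s html.parser to keep only the visible text (including image alt text) and the link targets from a page, which is roughly what a spider sees. The sample HTML is invented for the example.

```python
# A rough sketch of what a text-only browser (or spider) retains from
# a page: visible text, alt text, and link targets. Standard library
# only; the sample markup is made up for illustration.
from html.parser import HTMLParser

class TextOnlyView(HTMLParser):
    def __init__(self):
        super().__init__()
        self.text = []
        self.links = []
        self._skip = 0  # depth inside <script>/<style>, which spiders ignore

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)
        elif tag == "img":
            # images contribute only their alt text
            for name, value in attrs:
                if name == "alt":
                    self.text.append(value)

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())

page = ('<h1>SEO Services</h1>'
        '<img src="logo.png" alt="seo company Toronto">'
        '<a href="/contact.html">Contact us</a>'
        '<script>var x = 1;</script>')
viewer = TextOnlyView()
viewer.feed(page)
print(viewer.text)   # visible text plus alt text
print(viewer.links)  # link targets a spider would follow
```

Note how the script block disappears entirely and the image survives only as its alt text, which is exactly the behaviour the article describes.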

Knowing how search engines work can help a web designer or SEO specialist strategically build a site that is SEO friendly. A search engine can’t see images and can only read the text on a site. Websites that use passwords to restrict access to pages stop spiders from crawling these links and prevent those pages from being indexed. When linking pages within your site, be careful to include relevant anchor text that matches the content of the linked pages. In order to better understand how a search engine sees your site, you can use an all-text browser, which only displays the text and links found within your site.

Design your Site with SEO in Mind

There are many design factors that can greatly affect your website’s placement in search results. Unfortunately, many designers work against their own best interest by unknowingly using methods that decrease a site’s ranking on search engines. Most often, SEO experts end up having to correct these design errors in order to achieve top search engine placement. This article outlines a few design errors that should be avoided when creating a website with SEO in mind.

The first mistake made by some designers is the use of splash pages. Splash pages are those introductory pages that appear before users can advance to your site. The problem with splash pages is that search bots don’t like them for a number of reasons. One reason is that most splash pages don’t contain the types of internal links found on the site’s actual homepage. Splash pages are also sometimes designed to advance the user to the site’s homepage automatically after a certain period of time. The problem with this is that a search bot doesn’t know to stick around and wait; if it is left waiting for too long, it will give up and move on to the next page in its queue. Another problem is that splash pages are not designed for text content and instead contain Flash animations or graphics, which are transparent to search bots. The first page crawled by a search bot should contain relevant content and links to every other page of your site.

If you understand how search bots work then you will know that they can read text and some HTML but can’t read images, videos, or Flash animations. Many designers build media-rich websites full of images and animations believing their artistic ability will produce a better quality website. Although the aesthetics of the website might be outstanding, if the site isn’t SEO friendly and can’t be found in search results it won’t do much good. Websites built entirely with Flash are perfect examples of this and should be avoided. The website might appear relevant and prove useful to the user, but to a search engine it appears empty. As a consequence, search engines will have no idea what the site is actually about and won’t place it in search results.

Many designers use graphics for page headers in order to add artistic value or to use fancy fonts not available in text form. Header tags are a vital component of your website as they describe the theme of each page within your site. An analogy is a textbook without any headings: each chapter or section would run into the next, and at a quick glance you would have no idea what each chapter is about. If you use images for your header tags, this is exactly how a search engine will see your site. For SEO purposes, best practice is to use text for all header tags.

Weighing down a site with too much complex code can also negatively impact your site’s ability to place high in search results. Search bots can read some HTML, but complex code such as JavaScript is often ignored. As a result, links within a script might not be indexed. This is particularly troublesome if JavaScript is used for your site’s navigation system. It is best to keep the amount of JavaScript used to a minimum. If JavaScript is used as part of your navigation system, be sure to include text links to each page of your site somewhere else on the page where search bots can crawl them. These text links are usually placed at the bottom of a page so they don’t upset the aesthetics of the site. A second problem occurs when your content is buried under hundreds of lines of code or nested within tables. If a search bot has to crawl through hundreds of lines of code in order to find text, chances are it will give up before crawling the entire page and move on to the next site in its queue. A good rule of thumb is to have more text than code.

There are many design factors that can hinder a website’s ability to place in search results. In order to build an SEO friendly site it is important to avoid splash pages, avoid excessive use of graphics and animations, use text in header tags, and avoid too much complex code. A website designed with SEO in mind is much easier for search bots to crawl and easier to place on the first page of search results, while placing a website that isn’t favoured by search engines on the first page is almost impossible. As a result, many web designers end up outsourcing website optimization to experienced SEO experts who have a better understanding of how search engines work.

The Importance of Checking “C” Class IPs and Link Building

Link building is one of the most important aspects of search engine optimization: the more links a site has, the better. Ethical link building takes time, and unfortunately many have tried to find loopholes in the system. Some SEO specialists learned in the past that they could cheat the system by building multiple websites and using them to link to the websites they want to place within search results. As a defence mechanism, search engines have updated their algorithms to check for duplicate C class IPs among inbound links. This article explains what C class IP addresses are, why you should avoid cross linking sites from duplicate C classes, and how to check the C class of inbound and outbound links.

An IP address is a unique number that identifies a computer on the internet. The four blocks that make up an IPv4 address can be represented as AAA.BBB.CCC.DDD, with the C class referring to the CCC block. Like an individual computer, your hosting account is assigned an IP address, and every domain hosted within the account will share that same IP address. As a result, you should avoid cross linking any sites hosted within the same account. Even if the domains are registered under different names, the fact that they share the same IP address tells search bots these sites come from the same neighbourhood. Sites linked in this way look suspicious to search engines, and the links will either be ignored or both sites may be penalized. If you have multiple sites and want to link them together, you should assign the rel=”nofollow” attribute or host them on accounts with different class C IP addresses.

Even if you own a single site, it is still important to avoid duplicate C class IPs across all inbound and outbound links. Each time you link to a site, check its IP address to ensure the C class is different from yours. If your C class matches that of the site you are linking to, and it links back, it will look suspicious to search engines. It is also important to check the IPs of websites that link to you. For example, some directory owners have multiple websites hosted on the same account, and you may not be aware of this. Submission to these directories can end up being a huge waste of time: search engines know that these sites come from the same source and may only count the links from one of them, if any. Acquiring links from a variety of sources is important for optimization, and the worst thing you can do is taint your link portfolio by adding multiple links from duplicate C class IPs.

Fortunately, checking for duplicate C class IP addresses is easy if you have the right tool. The tool I like to use is called a class C IP checker, and one can be found at www.ip-report.com. This tool allows you to load a text file with all the IPs you wish to check, which saves you time as you continue to build your list of domains without having to rebuild the list from scratch.
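The checking described above can be sketched in a few lines of Python: group addresses by their first three octets (the C class block) and flag any that collide. The addresses here are made-up examples, and in practice the list would be loaded from a text file as the tool does.

```python
# A minimal sketch of a class C IP checker: group IPv4 addresses by
# their first three octets (the "C class" block) and report duplicates.
# The sample addresses are invented for illustration.
from collections import defaultdict

def c_class(ip):
    """Return the C class block of an IPv4 address, e.g. '192.168.1'."""
    return ".".join(ip.split(".")[:3])

def find_duplicate_c_classes(ips):
    """Map each shared C class block to the addresses in it."""
    groups = defaultdict(list)
    for ip in ips:
        groups[c_class(ip)].append(ip)
    return {block: addrs for block, addrs in groups.items() if len(addrs) > 1}

# In practice the list would come from a text file of link IPs, e.g.
# ips = open("link_ips.txt").read().split()
ips = ["192.168.1.10", "192.168.1.77", "10.0.0.5"]
print(find_duplicate_c_classes(ips))  # {'192.168.1': ['192.168.1.10', '192.168.1.77']}
```

Any group that appears in the output represents links that search engines would treat as coming from the same neighbourhood.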

Search engines have advanced their algorithms in order to prevent unethical link building practices. One precaution being taken is to check the C class of IPs to ensure that links are from different sources. All website owners should check the IPs of inbound and outbound links in order to build a healthy link portfolio. Using a class C IP checker is perhaps the easiest way to ensure all your links come from different sources.

No One Likes a Copycat Especially Search Engines

Content writing is one of the most challenging aspects of building a well optimized site. Most often, SEO experts have to outsource this component to professional content writers. Unfortunately, outsourcing costs money, and sometimes unethical developers steal your content and claim it as their own. Duplicate content is frowned upon by search engines and can result in both parties being penalized. This article discusses how you can prevent your content from being plagiarised and how to avoid duplicate content using Copyscape.

Copyscape is an online plagiarism service for website owners who want to protect their content from being stolen. If your site contains duplicate content it will be penalized by search engines. No one can be certain of the extent to which a site containing duplicate content will be penalized, but one could assume that the longer duplicate content is present on your site, the greater the penalty would be. Search engines function primarily to provide original content to users. If search engines didn’t penalize duplicate content, a single search query could yield multiple websites with the same content, which would be extremely annoying and result in a poor user experience.

Copyscape can also help prevent content writers from accidentally lifting small pieces of content from the sources they reference. Content writers rarely possess enough industry knowledge to write content without doing some research and referencing different sources, and in the process they sometimes borrow thoughts and ideas that originate from someone else. In doing so they must be careful to only extract the concepts and translate them into their own words. Using Copyscape to check the content they write can help them avoid duplicate content.

No one likes a copycat, especially search engines. Fortunately, there are services we can use to prevent plagiarism. Copyscape can be used to check whether your content has been plagiarised, and it is also a useful tool for preventing content writers from accidentally duplicating content. Copyscape even provides free warning badges that can be placed on your site to deter someone from stealing your content.

Drive Traffic to Your Site with Meta Descriptions

As you comb through search results you will find a variety of site descriptions within the organic listings. Without these descriptions, users wouldn’t know which websites to select. The meta description tag contains important information about the contents of a page and is displayed when users perform search queries. Without this tag, search engines have no choice but to assemble a description from snippets of text found within the content of a page, and that description is often mechanical and doesn’t reflect the true purpose of the page. This article discusses why meta descriptions are important for branding and offers a few tips for SEO experts to consider when creating these tags.

The meta description of a page is an important aspect of search marketing. This tag contains a description of an individual page within your site and is displayed to users when they perform a search query. These descriptions do not directly improve website optimization, but they do provide an opportunity to brand your business and can be used to display important information that is beneficial for users. As an example, the meta description for the SEO services page within my site is displayed within Google search results as:

I chose to include my phone number within the description to encourage those interested in my services to call. I also included a very brief two-sentence description of the services I provide.

You will notice that certain keywords within the above description are bold. The results displayed were based on the search query I performed using the keyword “Calgary seo services”, so Google displays the keywords “Calgary”, “SEO Services”, and “Services” in bold to improve the user experience. It is important to include the keywords your site is optimized for within your meta description. However, this has been abused in the past by inexperienced SEO experts stuffing the description with numerous keywords, the end result being a description that looked totally inorganic and decreased the user experience. Search engines are on the lookout for keyword stuffing and will penalize any site that looks suspicious.

You will also notice that not all of the description is displayed within search results; some of it is cut off. The length of the description is important. For example, if a description is less than 50 characters, Google might not display it at all and will instead assemble its own description from snippets of content on the page. To avoid this, be sure your description contains at least 1 to 3 well structured sentences.

A common mistake made by inexperienced SEO experts is to use the same meta description for every page within a site. Each page of your site contains different information and therefore should have a unique meta description. A well optimized site will have unique meta data for each page that matches that page’s content.
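The two checks above, a minimum length and per-page uniqueness, can be sketched as a small Python audit. The page URLs and descriptions are invented examples, and the 50-character threshold is the one mentioned above for Google.

```python
# A small sketch auditing a site's meta descriptions: flag any that are
# too short (under 50 characters, per the threshold discussed above)
# and any that are reused on more than one page. Sample data is made up.
from collections import Counter

MIN_LENGTH = 50  # below this, the engine may substitute its own snippet

def audit_meta_descriptions(pages):
    """pages maps URL -> meta description; returns (too_short, duplicated)."""
    too_short = [url for url, desc in pages.items() if len(desc) < MIN_LENGTH]
    counts = Counter(pages.values())
    duplicated = [url for url, desc in pages.items() if counts[desc] > 1]
    return too_short, duplicated

pages = {
    "/services.html": "Affordable SEO services in Calgary. Call 555-0100 for a free quote today.",
    "/about.html": "Affordable SEO services in Calgary. Call 555-0100 for a free quote today.",
    "/contact.html": "Contact us.",
}
short, dupes = audit_meta_descriptions(pages)
print(short)  # ['/contact.html']
print(dupes)  # ['/services.html', '/about.html']
```

Pages flagged in either list are the ones to rewrite before a search engine rewrites them for you.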

Search engines are about relevance, and if the content of your site doesn’t match the meta description it will confuse them. If your description is entirely different from the content of your site, it will be perceived as deceptive and result in a penalty.

A well written meta description provides an opportunity to brand your business and draw user attention. A meta description should consist of 1 to 3 sentences that describe the contents of a page and be unique to each page. The description must be relevant to the content of your site, and the excessive use of keywords should be avoided. An enticing meta description can distinguish your business from the competition and help drive organic traffic away from your competitors and to your website.

Use Split Testing to Create Killer Ad Copy

One of the hardest components of managing a pay per click ad campaign is writing compelling ad copy. The limited number of characters, and the fact that quality score is based on relevance, make this challenge even greater. One way to overcome this obstacle is through split testing, whether you do it yourself or hire a PPC agency to do it for you. This article explains what split testing is and how it is used by PPC managers to create killer ad copy.

Split testing involves running multiple ads within the same ad group simultaneously. The idea is that the ads compete against each other over a set amount of time. The metric used to determine success is click through rate (CTR). This metric matters because CTR is an important factor in quality score, which directly affects ad rank and bid price: the higher the CTR, the less you will have to spend to maintain the same position within search results. A high CTR also indicates that your ad appeals to users and is drawing clicks away from your competitors.

After a set amount of time, the ad with the lowest click through rate is deleted and a new ad is written in its place. The new ad doesn’t have to be entirely different; in fact, the opposite is true. If you already have a successful ad, you may only want to make small changes to the ad copy. Sometimes small modifications can make surprisingly huge differences.
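The split-test bookkeeping described above can be sketched in Python: compute each ad's CTR and identify the loser to replace. The ad names and figures are hypothetical.

```python
# A minimal sketch of split-test bookkeeping: compute click-through
# rates and pick the lowest-CTR ad for deletion and rewriting.
# The ad names and numbers are invented for illustration.
def ctr(clicks, impressions):
    """Click-through rate as a fraction."""
    return clicks / impressions if impressions else 0.0

def pick_loser(ads):
    """ads maps ad name -> (clicks, impressions); return the lowest-CTR ad."""
    return min(ads, key=lambda name: ctr(*ads[name]))

ads = {
    "ad_a": (40, 1000),  # 4.0% CTR
    "ad_b": (25, 1000),  # 2.5% CTR -> delete and rewrite this one
}
print(pick_loser(ads))  # ad_b
```

In a real campaign the platform reports these figures for you; the point is simply that the losing variant is replaced and the test repeats.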

This process can be repeated indefinitely; some managers spend as much as two years optimizing a campaign with this technique. If you continue to see results and beat the previous ad, the best advice is to continue in that direction. If you reach a plateau where you can no longer beat the current ad, you may have reached your zenith. However, the world of search marketing is constantly changing, so it is beneficial to still write a new ad once in a while.

For those new to PPC advertising, it is important to know that two ads cannot be shown at the same time. With Adwords you have the option to rotate your ads evenly or to optimize. If you select rotate evenly, each ad will be shown approximately the same number of times regardless of which ad is more successful. Most often it is a better choice to select optimize, which displays the better performing ad more often. This option is probably best for PPC managers who set up a split test and don’t plan on checking the results for a week or two. Microsoft adCenter doesn’t offer any options in regards to ad rotation and will automatically optimize based on click through rate.

Split testing is a powerful tool used by PPC managers to optimize ad performance, increase CTR, lower costs, and write compelling ads that draw clicks away from competitors. Both Adwords and adCenter allow for split testing and will rotate competing ads. The ad with the higher CTR is kept and the losing ad is replaced with new ad creative. This process is repeated until a plateau is reached, at which point it is safe to assume you have created killer ad copy.

Creating a Healthy Inbound Link Portfolio

A healthy inbound link portfolio is an essential aspect of SEO. Some links are extremely beneficial and can increase the trustrank of your site with search engines. However, there are some links that are of little value and can even taint your link portfolio. This article provides an overview of link building and how to maintain a healthy link portfolio.

Links from Multiple Sources

A healthy link portfolio consists of links from a variety of sources and domain extensions. It is important to acquire links from a variety of source types including directories, blogs, press releases, article submissions, etc. For example, if all of your links come from directories, the full strength of these links won’t be recognized, as the pattern will appear unnatural to search engines. Adding a few links from other sources would greatly benefit this type of portfolio and increase the strength of the directory links.

For example, let’s compare the link portfolios of two imaginary sites, A and B. Site A has 1,000 inbound links from six sources and site B has 100 inbound links from sixty sources. Site B will be recognized as an authority within its industry and all of its inbound links will be valued by search engines, while site A will appear spammy and only a few of the links from each source will be recognized. It is more important to add links from a variety of sources than to have multiple links from the same source.
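The comparison above boils down to counting distinct sources versus total links, which a short Python sketch can illustrate. The domains are invented, and the lists are scaled down from the 1,000-link example.

```python
# A rough sketch of link-portfolio diversity: total links versus
# distinct source domains. The domains and pages are invented, and the
# lists are scaled down from the text's 1,000-link example.
def portfolio_summary(links):
    """links is a list of (source_domain, page) pairs.

    Returns (total_links, distinct_sources)."""
    sources = {src for src, _ in links}
    return len(links), len(sources)

# Many links, one source: the pattern the text calls spammy.
site_a = [("bigdirectory.com", "page{}".format(i)) for i in range(5)]
# Same number of links, each from a different source: looks organic.
site_b = [("site{}.com".format(i), "home") for i in range(5)]

print(portfolio_summary(site_a))  # (5, 1)
print(portfolio_summary(site_b))  # (5, 5)
```

The second number is the one that matters: a portfolio with many distinct sources looks organic even when the raw link count is modest.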

Domain extension is another factor that can help add variety to your link portfolio. If you live in a certain country, such as Canada, you would expect the bulk of your links to come from domains with a .ca or .com extension. To add more variety to this portfolio, links from domains with .info, .org, .edu, .net, or .gov extensions should be acquired. Links from .gov and .edu domains are believed to contribute greatly towards search engine trustrank: the registration of these domains is restricted, and as a result most SEO experts believe these links carry special value.

Rate of Link Acquisition

If inbound links are acquired too rapidly, it can appear unnatural to search engines and result in a penalty. This is a common mistake made by inexperienced SEO experts who become too ambitious. If a site doesn’t normally acquire links on a regular basis and then all of a sudden begins acquiring 5 links a day, it raises a red flag for search engines and can result in a penalty.

In contrast, if link building comes to an abrupt halt it can also raise a red flag. For example, if numerous inbound links are added to your website over a short period of time, followed by a prolonged period of inactivity, that too can look suspicious to search engines. The best advice is to provide some type of ongoing maintenance in which a few links are progressively acquired.

Site Relevance

Perhaps one of the most important link building factors is site relevance. If you acquire links from sites outside of your industry they will hold very little weight; in fact, links from sites totally unrelated to yours appear inorganic and weaken your link portfolio. Links from sites that are industry specific are as good as gold. These links will not only increase the strength of your link portfolio but also drive quality traffic to your website. Even if these links come from nofollow blogs, they can still help steer the overall theme of your link profile towards relevance. These nofollow links won’t pass along link juice, but they can help your profile appear more organic.

PageRank of Linking Sites

The PageRank (PR) of the site that links to yours is another important factor. However, it is more important to pay attention to the total number of outbound links these sites have than to their PageRank alone. A link from a PR6 website with 1,000 outbound links contributes very little link juice; in fact, you would be better off with a link from a PR2 website with only 10 outbound links. Most often, acquiring links takes much effort and you will gladly accept links from any site regardless of its PageRank or the number of outbound links it has. However, it is a factor to consider if you have to choose between linking sites.
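As a back-of-the-envelope illustration of the comparison above, assume the simplified rule that a page's value is split evenly among its outbound links. Real PageRank is more involved than this; the sketch only shows why the outbound link count matters as much as the PR number itself.

```python
# A back-of-the-envelope sketch: under the simplified rule that a
# page's PageRank is divided evenly among its outbound links, a lower
# PR page with few links can pass more value per link than a higher
# PR page saturated with links.
def link_juice(page_rank, outbound_links):
    """Approximate value passed by one link on the page (simplified model)."""
    return page_rank / outbound_links

print(link_juice(6, 1000))  # PR6 page, 1,000 links -> 0.006 per link
print(link_juice(2, 10))    # PR2 page, 10 links    -> 0.2 per link
```

Under this simplified model, the PR2 page's link is worth over thirty times the PR6 page's link, matching the example in the text.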

Link Value Increases with Age

The last factor I would like to discuss involves site and link age. It is well known that the age of a domain plays a critical role in trustrank, so the age of the site linking to yours is very important to consider. Links from sites that have been in existence for several years are more trusted by search engines than links from new sites. The length of time a link has been in place also matters: a link that has aged for several months will carry more strength than a newly acquired link.

How to Avoid SEO Schemes

Because most business owners don’t understand the technical side of search engine optimization, they fall victim to SEO schemes. Although most SEO consultants practice ethical search engine optimization, there are some in the industry who are dishonest. Like many other industries in today’s society, it is up to the consumer to do some research in order to avoid being taken advantage of. The purpose of this article is to educate consumers about common SEO schemes used by some SEO consultants who lack ethics.

One of the most common SEO schemes involves providing inbound links from sites owned by the SEO company. Instead of investing time and money to build links to your site, the company optimizes sites it owns and provides backlinks from them. This makes your site’s rankings dependent on links the company controls. Once you stop paying your monthly maintenance fee, these links are usually removed, and within a month or two your placement in search results begins to drop. To maintain your placement within search results, you are locked into paying a monthly maintenance fee.

Another common scheme is for an SEO consultant to use your website to optimize their own. Many SEO companies like to place a link at the bottom of your site crediting their business and linking back to their company website. If a link is required to acknowledge that a certain SEO company optimized your website, be sure to ask whether that link is assigned the nofollow attribute. The attribute still lets users follow the link but prevents any PageRank from being transferred from your site to theirs. If you hire an SEO company to optimize your site, ask whether they require such a link and have them explain why they feel it is essential.
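A minimal way to check whether a credit link carries the nofollow attribute is to inspect the anchor’s `rel` value in the page source. The sketch below uses Python’s standard `html.parser`; the page snippet and URLs are hypothetical examples, not a real SEO company:

```python
from html.parser import HTMLParser

class NofollowChecker(HTMLParser):
    """Collect every outbound anchor and whether it carries rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            rel = attrs.get("rel") or ""
            self.links.append((attrs.get("href"), "nofollow" in rel.split()))

# Hypothetical footer with one properly attributed credit link and one plain one
page = ('<p>Optimized by <a href="https://example-seo.test" rel="nofollow">'
        'Example SEO</a> | <a href="https://example-seo.test">plain credit</a></p>')

checker = NofollowChecker()
checker.feed(page)
for href, is_nofollow in checker.links:
    print(href, "nofollow" if is_nofollow else "followed")
```

Running this over your own footer shows at a glance whether a required credit link is leaking PageRank.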

It is also important to watch for SEO companies that use your website to optimize their clients’ websites. Link pages can be a valuable resource if the links they contain actually benefit the user and are industry specific. Unfortunately, some SEO companies use link pages to link to their clients. They lead each business owner to believe they are participating in an authentic reciprocal link exchange campaign, when in fact all the company is really doing is linking its customers’ websites together. There are two major problems with this type of SEO strategy.

The first problem is that, once again, the SEO company has the authority to remove these links once you are no longer paying monthly maintenance fees. If the reciprocal links are removed, all you are left with is a link page with several outbound links and no incoming links: a net loss of PageRank. To fix this, certain links will have to be removed from the page, and often the page will have to be removed entirely. If the site owner doesn’t possess a basic understanding of HTML, the task will have to be outsourced at additional cost.

The second problem is that a link page should contain links to useful, industry-specific resources that benefit the user. Unfortunately, many SEO consultants forget about user experience and focus strictly on optimization. As a result, most of these link pages end up as a random collection of unrelated links that have nothing to do with your industry. Such links have very little value and often raise a red flag for search engines, signalling that inorganic optimization is taking place.

To avoid this, ask your consultant about their linking strategy and whether they can provide samples of link sources. You can also generate a monthly link report using one of several online backlink checkers. Make sure the links acquired come from sources outside the SEO company’s control.
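Once you have a backlink report, the cross-check itself is simple: compare each link’s domain against the domains you know (or suspect) the SEO company controls. A rough sketch, with entirely hypothetical URLs and domain names:

```python
from urllib.parse import urlparse

# Hypothetical backlinks from a monthly report
backlinks = [
    "https://industry-news.test/review",
    "https://seo-network-1.test/links",
    "https://partner-blog.test/post/42",
]

# Hypothetical domains believed to be controlled by the SEO company
company_controlled = {"seo-network-1.test", "seo-network-2.test"}

for url in backlinks:
    domain = urlparse(url).hostname
    flag = "SUSPECT" if domain in company_controlled else "ok"
    print(f"{flag:7s} {url}")
```

If most of your new links land in the SUSPECT column, the “link building” you are paying for is likely just the company’s own network.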

There are many SEO schemes like the ones mentioned in this article. The best advice is to do some research and educate yourself about basic SEO principles before hiring an SEO consultant. Reading this article should help you identify some of the more common schemes. When inquiring about a company’s SEO services, ask specifically about their link-building strategies, whether they require a credit link at the bottom of the pages within your site, and whether they use or recommend link pages. As in many other industries, education is the only sure way to avoid SEO schemes.