In this chapter, I’ll answer some common questions about Advanced SEO.

Can 301 redirects hurt my search rankings?

If you’re not familiar with 301 redirects, please read the following article to get up to speed on the topic: https://support.google.com/webmasters/answer/93633?hl=en

The short answer to this question is no, 301 redirects will not hurt your search rankings. In fact, they are Google’s own recommended best practice for redirecting old pages and for moving domains. All of your links, page value and search rankings should pass through the redirect.
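To make the mechanics concrete, here’s a minimal sketch of what serving a 301 looks like, written in Python using only the standard library. In practice you’d normally configure redirects in your web server or CMS rather than in application code, and the old and new URLs below are just placeholders.

    # Minimal sketch: answer requests for an old URL with a permanent (301) redirect.
    # The mapping below is a made-up example; real redirects usually live in server config.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    REDIRECTS = {"/old-page.html": "https://www.example.com/new-page.html"}

    class RedirectHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            target = REDIRECTS.get(self.path)
            if target:
                self.send_response(301)               # 301 = moved permanently
                self.send_header("Location", target)  # where the page now lives
                self.end_headers()
            else:
                self.send_response(404)
                self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), RedirectHandler).serve_forever()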

That said, there are a few issues that can sometimes arise when using 301s.

Delay in set-up

The first and most common issue is a delay between the time you set up the 301 and the time the redirect is recognized and updated by Google. Sometimes this process is quick and sometimes it’s not, and when it’s not you can potentially see a loss in search rankings. If this happens to you, there’s nothing to worry about; it’s only a matter of time before your rankings are restored.

Existing penalties

Another issue can arise if you’re setting up a 301 in order to escape a penalty that Google has placed on your page. It’s important to understand that a 301 will pass along a penalty just as it will pass along link value. If you have picked up a penalty, it is best to keep that page unassociated with your other sites, at least until you can have the penalty lifted.

Too many redirects

Finally, you can experience a loss in rankings if you set up too many 301 redirects. Google does its best to pass all of your old value through to your new pages. However, it is possible to lose small amounts of value through redirects. This is normally not a problem when redirecting one page to another, since any loss of value is typically too small to affect your overall rankings. But if you set up a lot of redirects, then the small loss of value through each one is multiplied. One of the more common reasons behind the use of too many redirects is webmasters using them as a quick fix for broken links, or creating a 301 chain rather than fixing the fundamental problem.
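If you suspect a chain has crept in, a rough check like the one below will show every hop a crawler has to follow before it reaches the final page. It’s only a sketch: it assumes the third-party requests library is installed, and the URL is a placeholder.

    # Follow a URL's redirects and report each hop; more than one hop means a chain.
    import requests

    def count_redirect_hops(url):
        response = requests.get(url, allow_redirects=True, timeout=10)
        for hop in response.history:          # each entry is one redirect that was followed
            print(hop.status_code, hop.url)
        print("Final:", response.status_code, response.url)
        return len(response.history)

    if count_redirect_hops("https://www.example.com/old-page.html") > 1:
        print("Redirect chain detected; point the old URL straight at the final page.")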

Are hyphens in URLs okay?

The question of whether or not hyphens in URLs are bad for SEO comes up fairly often. It often arises when shopping around for a new domain name or when deciding how to structure the directory and page names of a new site.

I’ve talked to a lot of SEOs about this over the years and the general consensus seems to be that hyphens do not hurt; in fact, they can actually help in a number of ways. But of course, the answer isn’t black and white and there are considerations to take into account.

For example, hyphens in your directory and page names are perfectly fine. They make your URLs easier for people to read, such as when they’re scanning search result listings, and they help search engine bots properly parse the file names.

Hyphens make URLs easier to read

Search engines typically display your URL as part of your listing on search results pages. A keyword-rich URL can give people an idea of what to expect when they click on your link. Using hyphens makes reading these URLs, and spotting valuable keywords, very easy, and because of this can have a positive impact on your click-through rates.

Hyphens help bots to parse file names correctly

Using hyphens as word separators in URLs is primarily beneficial for surfers, but it might also help search engines to see the same keywords that you do. A great example that’s always floated around on SEO websites is “expertsexchange.htm”. By hyphenating the URL you might help search engine bots to see “experts-exchange.htm” rather than “expert-sex-change.htm”. Of course, whether or not search engine bots even need our help is debatable, but at least I finally got a reason to use this great example!
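If you generate page or file names programmatically, a small helper like the sketch below keeps those word boundaries visible. The function name and rules are just an example; adapt them to your own conventions.

    # Turn a page title into a hyphenated, lowercase slug, e.g.
    # "Experts Exchange" -> "experts-exchange" rather than "expertsexchange".
    import re

    def slugify(title):
        slug = title.lower()
        slug = re.sub(r"[^a-z0-9]+", "-", slug)   # collapse spaces and punctuation into hyphens
        return slug.strip("-")

    print(slugify("Experts Exchange"))    # experts-exchange
    print(slugify("Advanced SEO: FAQ"))   # advanced-seo-faq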

Hyphens in domain names

Whether or not to register domains with hyphens is an age-old concern. I’ve personally had hundreds of domains with hyphens that ranked just fine, and you can Google many popular keyword phrases and see that other hyphenated domains rank well too. I would recommend against registering domains with too many hyphens, though. I personally like to keep domains to two hyphens max, to prevent having the domain devalued or potentially labeled as spammy.

Does running large blog networks still work?

One strategy that’s worked very well for a lot of Internet marketers is setting up and running a large network of blogs, often consisting of well over 50 individual blogs. Some webmasters set up their own WPMU networks so that all of their blogs resided on subdomains, others set up their blogs on hosted blog services to take advantage of the inherited domain authority of the hosting domain, and others set up all of their blogs on their own root domains. Each of these strategies has its own unique pros and cons, but I know of many webmasters who were able to make a business out of each one.

Whichever strategy was used, the basic concept was to build out each blog to target a different niche or a different product, fill each with some base posts, interlink them all, and set up link trades with as many outside blogs as possible. Traffic would come from some of the link trades as well as from organic Google listings. And the Google listings would come because of the targeted domain or subdomain, the targeted content and, most importantly, the link value passed through all of the link trades.

Devalued by Penguin and Panda algorithms

This strategy, however, was significantly devalued when Google launched its Penguin algorithm. Penguin targeted sites that had the majority of their links coming from low-value sites and that had their target keyword phrase in the majority of their anchors. This, of course, is a near-perfect description of the typical link trade. In addition to Penguin, Google updated its Panda algorithm, which targeted the content displayed on a website and was specifically intended to improve detection of affiliate websites. Google doesn’t feel that affiliate websites are of great value to its users and tends to prioritize sites without affiliate links and content.

So because most large blog networks were held up by low-value trades and a relatively small number of base posts, and were filled from top to bottom with affiliate links, they were affected by both Penguin and Panda. Some blog networks were set up on their own domains that matched their target keyword phrase exactly; these would have been affected by Google’s EMD (Exact Match Domain) algorithm update as well. This update basically removed the artificial value that used to be placed on exact-match domains.

These algorithm updates effectively eliminated the primary strategies that large blog networks were based on. Many of the affiliate bloggers I know had to attempt to regain their lost rankings by removing low-value trades and replacing them with higher-value backlinks. In addition to this, many tried to improve the quality of their content as well as reduce the overall number of outbound links, especially those that contained affiliate IDs.

So, do large blog networks like this still work? They can, yes, but the old strategies that many webmasters have used for years need to be abandoned in favor of newer strategies that are better aligned with Google’s goal of ranking only the highest-quality sites that provide the most value to its users.

How come my site doesn’t rank well anymore?

A drop in rankings can be due to a number of things, and the only way to really find out which one was responsible for your drop is to research each and every potential cause. When a client comes to me with this question, I like to start with the following list, which covers most of the more common causes of a drop in rank; 90% of all websites I’ve personally analyzed lost rank due to one or more factors on this list. (A short sketch after these lists shows how a few of the on-page items can be checked programmatically.)

Page Optimization

  • Content is thin
  • Content is duplicated
  • Content is poorly written or spun
  • Content is stuffed with keywords and/or links
  • Content is poorly organized
  • Page navigation is poorly structured
  • Too many outbound links
  • Too many outbound links containing affiliate IDs
  • Outbound links point to devalued sites
  • Duplicate title meta tags
  • Duplicate description meta tags
  • Presence of keyword meta tags
  • Too many broken links
  • Too many redirections
  • Canonicalization problems
  • Robots.txt problems

Linking

  • Detected as a link trader
  • Detected as a link buyer
  • Too many links from other sites detected as link traders or buyers
  • Too many low value links
  • Too many site-wide links (sidebars and footers)
  • Too many anchors containing the same keywords
  • Links from penalized pages
  • Unnatural link profiles
  • Loss of high value links
  • Devaluation of inbound links
  • Too few links pointing to internal pages

Other

  • Natural ranking boost removed
  • Malware detection
  • Slow loading pages
  • Manual or automatic penalties
  • Detection of blackhat SEO tactics
  • Natural rank shuffling
  • Rankings replaced by competitors
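As a starting point, here is a rough sketch of how a couple of the on-page items above can be checked programmatically: it flags pages with thin content and broken outbound links. It assumes the third-party requests and beautifulsoup4 libraries are installed; the URL list and the 300-word threshold are only examples.

    # Flag thin pages and broken outbound links across a small list of URLs.
    import requests
    from bs4 import BeautifulSoup

    urls = ["https://www.example.com/", "https://www.example.com/about.html"]

    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        word_count = len(soup.get_text().split())
        if word_count < 300:                         # rough threshold for "thin" content
            print(f"{url}: possibly thin content ({word_count} words)")
        for link in soup.find_all("a", href=True):
            href = link["href"]
            if href.startswith("http"):
                try:
                    status = requests.head(href, timeout=10, allow_redirects=True).status_code
                    if status >= 400:
                        print(f"{url}: broken outbound link -> {href} ({status})")
                except requests.RequestException:
                    print(f"{url}: unreachable outbound link -> {href}")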

How important are meta tags these days?

Meta tags as many of us know them have essentially all been devalued or ignored by the major search engines. Meta tags were developed, and were useful, back when search engines lacked the software and hardware sophistication they have now, back when they relied on us to tell them what our pages were about and how they should be included in the index.

Search engines of course no longer need us to help them determine how we should rank and for which keywords. And they don’t need us to tell them how often they should crawl our sites, who our intended audience is, what the rating of our content should be or whether or not they should cache our pages.

There are three meta tags, however, that are still important factors for your rankings.

Title tag

Search engines consider your title tag the primary indicator of which keywords you intend to rank for, and they use it as the link they display on their search results pages. This makes it very important, because it not only has to be written in a way that helps Google rank your site, but it also has to be written to attract attention and encourage clicks from search results pages. Title tags should be unique on every page.

Description tag

The description tag is sometimes used as the snippet of content that search engines display under your link on their search results pages. Because of this, it should be used to reinforce your target keywords and to provide people with an interesting snippet of content that sparks enough interest to click on your link. And like title tags, description tags should be unique on every page.
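If you want to confirm that your titles and descriptions really are unique, a quick audit along the lines of the sketch below will flag duplicates. It assumes the third-party requests and beautifulsoup4 libraries are installed, and the URL list is a placeholder.

    # Collect each page's title and meta description, then report any text
    # that appears on more than one page.
    from collections import defaultdict
    import requests
    from bs4 import BeautifulSoup

    urls = ["https://www.example.com/", "https://www.example.com/products.html"]
    titles, descriptions = defaultdict(list), defaultdict(list)

    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        if soup.title and soup.title.string:
            titles[soup.title.string.strip()].append(url)
        meta = soup.find("meta", attrs={"name": "description"})
        if meta and meta.get("content"):
            descriptions[meta["content"].strip()].append(url)

    for label, index in (("title", titles), ("description", descriptions)):
        for text, pages in index.items():
            if len(pages) > 1:
                print(f"Duplicate {label} on {len(pages)} pages: {text!r}")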

Robots tag

Robots meta tags can be used to tell search engine bots which pages should not be indexed, meaning pages you do not want ranked, such as login pages and customer support pages. You can use your robots.txt file to keep bots away from these pages, but the problem with this method is that if you have any link value passing through these pages and on to others within your site, that link value will be completely blocked and you will lose it. If you do have link value passing through pages you don’t want ranked, then the robots meta tag “noindex,follow” will prevent the page from being ranked while still allowing all link value to pass through to your other pages.
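As a quick sanity check, a sketch like the one below reports whether a page actually carries a noindex directive. It assumes the third-party requests and beautifulsoup4 libraries are installed, and the URL is a placeholder.

    # The tag itself looks like: <meta name="robots" content="noindex,follow">
    import requests
    from bs4 import BeautifulSoup

    url = "https://www.example.com/login.html"
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})

    if robots and "noindex" in robots.get("content", "").lower():
        print(f"{url} is set to noindex: {robots['content']}")
    else:
        print(f"{url} has no noindex directive and can be indexed and ranked")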

What’s the best SEO strategy for improving conversion rates?

The keywords you’re targeting will essentially determine your conversion rates. Search traffic in general is very high quality; however, you have to target keywords that match the content on your site as closely as possible.

To determine if you’re targeting the most relevant keyword phrase for your content, you really only have to do a couple of things: review the currently ranked top ten sites and check your Google Analytics account. Reviewing which sites are currently ranking in the top ten is something you should do every few months, since some phrases tend to change focus over time. As for reviewing Analytics, you want to pay attention to three important metrics:

1. Bounce Rate

2. Visit Duration

3. Pages per Visit

Bounce Rate

This metric tells you how many people left your site after viewing only the page they landed on; it’s an indicator of the level of interest people have in your site. The concept is similar to comparing raw hits to uniques: the better the ratio of raws to uniques, the better the traffic quality. So if you have a high bounce rate (from organic search traffic), you know that you’re not targeting the best keywords.

Visit Duration

This metric tells you how long people stay on your site. If the majority of your search traffic leaves in less than a minute, then you know you have a problem somewhere.

Pages per Visit

This metric measures how many pages people click through during their visit. It’s essentially an extended view of your unique-visitor stats, showing how far beyond the landing page people go.
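To make the arithmetic behind these three metrics concrete, here’s a back-of-the-envelope sketch using invented session data; your real numbers would of course come from Analytics or your own logs.

    # Compute bounce rate, average visit duration and pages per visit from sample sessions.
    sessions = [
        {"pages": 1, "seconds": 5},     # a bounce: one page, then gone
        {"pages": 4, "seconds": 180},
        {"pages": 2, "seconds": 60},
    ]

    bounce_rate = sum(1 for s in sessions if s["pages"] == 1) / len(sessions)
    avg_duration = sum(s["seconds"] for s in sessions) / len(sessions)
    pages_per_visit = sum(s["pages"] for s in sessions) / len(sessions)

    print(f"Bounce rate: {bounce_rate:.0%}")            # 33%
    print(f"Avg. visit duration: {avg_duration:.0f}s")  # 82s
    print(f"Pages per visit: {pages_per_visit:.1f}")    # 2.3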

These metrics combine to provide evidence of surfer interest in your site and in your content. If you can get all three working for you, then your conversion rates are sure to improve.