
Category Archive: Technical SEO Issues

How to Recover If Your Site Is Penalized By Google

It’s a website owner’s nightmare: your page ranking suddenly drops without warning. There’s a good chance that you’ve been penalized by Google for something on your website. It might be a new SEO tactic or something you didn’t know existed in your code. Likely, it’s related to search engine algorithm changes (such as the Panda and Penguin algorithms), which are designed to increase the quality of Google’s search results.


Recovering from penalties can take time, but it can happen. Here are a few steps to take if you’ve been hit with a penalty from Google:

Link Penalties

Google has made a point of punishing websites that attempt to influence their ranking or PageRank by “manipulating” incoming or outgoing links. But while finding “bad links” can seem simple, Google will also penalize sites for other, seemingly more innocuous actions.

For example, your site might be penalized if you are listed in too many directories that are considered to be “low quality.” Google might also punish your site if you are making guest posts on low-quality sites that are unrelated to your industry or use links back to your site excessively in guest blog posts. The simplest way to fix this problem is to remove the offending links. You can choose a sample of links from different directories to try to determine where your bad links are coming from.

Panda-Related Penalties

Panda is designed to root out poor-quality pages that were previously achieving high rankings for reasons unrelated to their quality. Before you post any content, ask yourself if it answers an important question in a way that isn’t already covered online. If the answer is no, Google will likely treat your content as spam.

You also might have “doorway” pages on your site with minimal text that exist only to entice people to purchase products. If these pages are necessary for visitors, keep them out of Google’s index with noindex tags; if they aren’t, retire them with 301 redirects to avoid Google penalties.
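If a page needs to stay live for visitors but out of Google’s index, the noindex option is a single tag placed in the page’s head section:

    <meta name="robots" content="noindex">

A 301 redirect, by contrast, is the better choice when the page has a genuine replacement elsewhere on your site.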

Manual Penalties

Google will also institute manual penalties for unethical on-site practices such as cloaking, shady redirects, hidden text and keyword-stuffed pages. Chances are that you know if you are engaging in these practices, and you can rewrite content or code to fix the problem.

The Three Key Steps to Building Your Content Strategy

Unless you have a solid SEO content strategy, all of the content that you create for your website is just guesswork. The content might do an outstanding job of increasing your search engine rankings and driving customers to your site, or it might be completely useless. Developing a strategy that matches your overall online marketing goals is critical to making your content do real work for you. It can also help you justify your SEO activities to people in your organization who don’t have intimate knowledge of online marketing.


Three Parts of Every Great Content Strategy

1. Set your goals for your content campaign and the tactics you’ll use to reach them.

2. Identify the types of content that have verifiably succeeded, to use as models for future content development.

3. Convince others in the organization that content is about more than keywords and that it can serve as an educational resource for customers.

 

Three Steps to Building Your Content Strategy

Step 1: Performing a content inventory

The first step in building an SEO content strategy is to take an inventory of the content you currently have on your site. How long this takes will depend on the number of pages and items on your current site. However, it doesn’t need to be an arduous task. You can automate much of this process instead of manually hunting for content on a page-by-page basis.

Chances are that your site is built around a site map that breaks your pages out by categories. You can discover these content groups by using a site crawler like Screaming Frog. Once you do this, review your groups by applying some common sense: ask whether the groups share common expected visitor responses and are small enough that you can reasonably work with them.
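If you’d rather script this step than eyeball a crawl export, a few lines of Python can bucket URLs by their first path segment. This is only a sketch; it assumes you’ve exported the crawl to a plain text file of URLs (the urls.txt filename is an example):

    # Group crawled URLs into content buckets by their first path segment.
    from collections import defaultdict
    from urllib.parse import urlparse

    groups = defaultdict(list)
    with open("urls.txt") as f:  # hypothetical crawl export, one URL per line
        for line in f:
            url = line.strip()
            if not url:
                continue
            path = urlparse(url).path.strip("/")
            section = path.split("/")[0] if path else "(root)"
            groups[section].append(url)

    # Largest groups first: these are your candidate content categories.
    for section, urls in sorted(groups.items(), key=lambda kv: -len(kv[1])):
        print(section, len(urls), "pages")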

Step 2: Gathering data

Once you’ve categorized your current content, you need to analyze how it is performing. Some SEO experts overemphasize factors such as total page views, time spent on a page and bounce rates. You’ll get a stronger sense of the performance and potential power of your content by looking at social factors such as:

  • Number of Facebook Likes, shares and comments
  • Posts about the content on other social media sites like LinkedIn and Twitter
  • Votes and +1s on sites such as Reddit and Google+

You can also look at the pages themselves to see if they’ve received reviews or comments from outside readers. Finally, you can use a range of analytics tools to track readers’ movements after they read a piece of content to see whether it directly or indirectly leads to sales and conversions.

Language data is also an important piece of the puzzle. Track certain elements of each content page, such as the following (a rough readability-scoring sketch appears after this list):

  • Total words per page
  • The presence of paragraph, title and description tags
  • The reading level and ease of the content through standards such as Flesch-Kincaid
  • The overall usage of headings
  • Open Graph (Facebook) and Twitter Card markup
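Readability scores like these are easy to approximate yourself. The sketch below computes a rough Flesch Reading Ease score; the syllable counter is a crude vowel-group heuristic, so treat the output as useful for comparing pages rather than as an exact score:

    # Rough Flesch Reading Ease:
    # 206.835 - 1.015*(words/sentence) - 84.6*(syllables/word)
    import re

    def count_syllables(word):
        # Crude heuristic: count groups of consecutive vowels.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_reading_ease(text):
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        n = max(1, len(words))
        syllables = sum(count_syllables(w) for w in words)
        return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

    print(round(flesch_reading_ease("The cat sat on the mat. It was happy."), 1))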

Step 3: Evaluating your conclusions

Take a hard look at the data you’ve collected. You can usually find some interesting trends that will inform your overall SEO content strategy. You might discover that certain keywords in the title generate a high number of social media shares, or that the length of an article is related to how popular it is. Use these trends to build a clear content strategy, one that leverages these stats and makes it easier for people without SEO knowledge to understand why you are approaching content in a certain way.

Will Curated Content Help or Hurt Your SEO?

Website owners and operators typically believe that having more content on your site is better for SEO purposes, as it gives you more chances to use keywords and create valuable links. But this has changed recently as Google has instituted measures that make the quality of content as important as its quantity and punish websites that host spammy, low-quality content. Having large amounts of low-quality content on your site actually serves as a negative ranking signal for Google and can lead to penalties to your search ranking.


An example of this is performing content curation on your website as an SEO tactic. This can mean creating pages with resources on a category related to your business or having a blog that posts links to relevant outside news articles. On the surface, this is a way to add content to your site and help your search ranking. But the truth is that it might not help you at all.

Google’s Matt Cutts recently answered a question relating to sites that curate content or post duplicate content. The gist was that only sites of the highest publishing quality will get a benefit from curating content. This means sites like the New York Times’ website, which has a highly recognizable brand built on its content: it’s difficult to get something published in the New York Times, and its content is curated by professional editors.

In fact, if you curate content using an algorithm or other automated process, Google might actually punish you. That is essentially what Google does with its own search results, and it likely does it better. Google sees automatically generated curated-content pages as inferior forms of competition and punishes them.

Most sites fall somewhere in the middle of this spectrum. This means that while you might not get punished for having curated content on your site, you likely won’t see many benefits. This can change if you do some things to make your curated content stand out and send Google signals about the quality of your pages. Getting commentary from experts in your field and using the rel=author tag will increase your credibility in the eyes of Google. If you can pull data from sources that are not available to Google, or post content faster than Google can surface it, you also will be seen as providing something of unique value through your curated content.

Don’t Forget About Robots.txt Files

Anyone with a basic understanding of search engine optimization knows that metadata such as meta descriptions, image alt text and title tags is critical to SEO success. But there’s one element that many people forget that can cause SEO campaigns to fail: not properly implementing a Robots.txt file can make the difference between seeing your search engine rankings soar or sink.


Defining a Robots.txt File

Simply put, a Robots.txt file tells search engines which directories on your site you don’t want them to index. The reasons for this are varied. You might want to keep directories touching sensitive information, such as customer account pages on your eCommerce site, out of search results. Or you might have proprietary information posted on certain sections of your website that you want to keep out of the index. A Robots.txt file will tell Google, Bing and other legitimate search engines not to index these pages; note that it is a request honored by well-behaved crawlers, not a security measure in itself.

 

Robots.txt Keys for Implementation

Disallow access to sensitive directories. These can include directories such as /cgi-bin/, /wp-admin/, /cart/ and /scripts/ (a sample file follows this list).

Remove all barriers to main content. This includes making sure there are no stray noindex or nofollow directives blocking pages you want found in search.

Don’t let search engines index “duplicate” pages on your website. These can include printer-friendly versions of pages designed for regular viewing, or content designed specifically for mobile sites. It’s better to have search engines index only the main content page in these cases.
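Put together, a minimal Robots.txt file covering the directories mentioned above might look like this (adjust the paths to match your own site’s structure):

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /wp-admin/
    Disallow: /cart/
    Disallow: /scripts/

Each Disallow line blocks one directory for all crawlers (that’s what “User-agent: *” means); duplicate print or mobile directories can be added the same way.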

 

Things to Avoid

Putting unnecessary comments in your Robots.txt file. They do nothing for crawlers and can hint at what you’re trying to hide.

Listing every file in your Robots.txt file. This actually makes it easier for others to find the files you want to keep hidden; disallow whole directories instead.

Using an “/allow” tag. This doesn’t exist in the original Robots.txt standard; a page is allowed simply by not being disallowed.

Four Steps to Investigate Plummeting Traffic After a Relaunch

Nothing is more frustrating than spending weeks or even months redesigning your website, only to see traffic plummet soon after the new version goes live. It’s especially frustrating because you did all that work expecting to improve your search engine ranking, get more traffic and see more conversions. Where did it all go wrong?


In most cases, there is no single culprit; it’s usually a series of small mistakes that, on their own, would create minimal damage to your traffic but when taken together can be disastrous. Here are four of the most common problems we’ve found at eVisible when people try to relaunch their sites and their traffic tanks:

Check Google Analytics

It’s possible that your site’s analytics tracking didn’t automatically carry over when your new site was launched. Manually check Google Analytics to make sure that it’s enabled and working properly; if it isn’t, check individual pages for issues like missing tracking code.
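One quick way to verify coverage is to fetch a handful of pages and confirm the tracking snippet is still in the source. A rough sketch; the page list and “UA-XXXXXX” tracking ID are placeholders to replace with your own:

    # Flag pages whose HTML no longer contains the analytics tracking ID.
    import urllib.request

    pages = ["https://www.example.com/", "https://www.example.com/about/"]
    tracking_id = "UA-XXXXXX"  # placeholder analytics ID

    for url in pages:
        html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")
        print(url, "OK" if tracking_id in html else "MISSING tracking code")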

Recheck Google Analytics

You can also go deeper with your analytics research to find unforeseen problems. Make sure that you have 301 redirects in place for any pages whose address structures changed, and look thoroughly for 404 pages. Run a keyword report to see which keywords are underperforming and whether those keywords were removed from the pages that are now struggling.
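A short script can confirm that old URLs arrive at their new homes via redirects instead of dying in 404s. A sketch with placeholder URL pairs; urlopen follows redirects, so a healthy 301 ends at the new URL:

    # Verify that old URLs redirect to their intended new locations.
    import urllib.error
    import urllib.request

    checks = [  # (old URL, expected final URL) pairs -- placeholders
        ("https://www.example.com/old-page", "https://www.example.com/new-page/"),
    ]

    for old, expected in checks:
        try:
            resp = urllib.request.urlopen(old)
            final = resp.geturl()
            print(old, "->", final, "OK" if final == expected else "UNEXPECTED TARGET")
        except urllib.error.HTTPError as e:
            print(old, ": HTTP", e.code, "-- broken link")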

Review Your robots.txt File

It’s possible that your site has been deindexed for some reason. One possible explanation is your robots.txt file. Check the head of your page source code for a meta robots tag set to noindex, and also check your robots.txt file for any line that says “Disallow: /”.
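Both checks are easy to automate; the sketch below fetches robots.txt and the homepage for a placeholder domain and flags the two usual deindexing culprits:

    # Flag a blanket "Disallow: /" in robots.txt and a noindex meta tag.
    import re
    import urllib.request

    site = "https://www.example.com"  # placeholder domain

    robots = urllib.request.urlopen(site + "/robots.txt").read().decode("utf-8", "ignore")
    if re.search(r"(?im)^\s*disallow:\s*/\s*$", robots):
        print("robots.txt is blocking the entire site")

    html = urllib.request.urlopen(site + "/").read().decode("utf-8", "ignore")
    # Assumes the usual attribute order: name before content.
    if re.search(r"<meta[^>]+name=['\"]robots['\"][^>]*noindex", html, re.I):
        print("homepage carries a noindex meta tag")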

Check for Host or Server Issues

Changing your hosting or server can lead to communication issues that make it impossible for search engines to index your site. There are several tools available for checking your DNS health; at a minimum, review the DNS errors and server connectivity reports in your Google or Bing Webmaster Tools.
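As a first sanity check, confirm that the domain still resolves; a stale DNS record after a host move locks out crawlers and visitors alike. A minimal sketch with a placeholder domain:

    # Confirm the domain resolves to an IP address after the move.
    import socket

    domain = "www.example.com"  # placeholder domain
    try:
        print(domain, "resolves to", socket.gethostbyname(domain))
    except socket.gaierror as err:
        print(domain, "failed to resolve:", err)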

20 Steps to Take Before Relaunching Your Site

Reworking your website to have a more SEO-friendly design is a great way to boost your online business’ potential; it can also be intimidating, especially right before your new site goes live. Before you relaunch your site to the world, you need to go through a last-minute checklist to ensure that the launch will be a success. Here are 20 steps to take when finalizing your professional website development:


Onsite Content:

1. Check that all of the content you’ve created – and the old content that you haven’t touched – is free of spelling errors, typos or other problems. Also make sure that the content is compelling for visitors.

2. Open your site in Google Chrome and press F12 and then open the “Network” tab to see your site’s speed and page sizes.

3. Test drive the forms on your site to make sure they work properly.

 

Web Development:

4. Verify that your site links are correct after they’ve been transferred from the test site to your live site. If you have any 404 pages, develop a custom 404 page with search to encourage visitors to go to other parts of your site. If pages have moved or have a new URL structure, make sure to use a 301 redirect (a sample rule follows this list).

5. Validate your W3C code and fix any errors you might have.

6. Add a custom favicon so it appears in users’ address bars and browser tabs, giving your site a custom feel.

7. Minify your site to compress the code and make it load faster.
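For the 301 redirects mentioned in step 4, each moved page needs only a single line if your server runs Apache with mod_alias; this is a hypothetical example with placeholder paths (nginx and IIS use different syntax):

    Redirect 301 /old-page /new-page
    Redirect 301 /old-section/article.html /new-section/article/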

 

Web Design

8. Cross-check your site on different browsers to ensure multi-browser compatibility.

9. Check that your images’ hover text comes up when you mouse over them. You should also make sure that your images aren’t too large and that you don’t have issues with fonts not rendering correctly.

 

Search Engine Optimization

10. Make sure that all pages have title tags and meta tags.

11. Upload an accurate site map in both XML and HTML formats to make your site easier to navigate for users and search engines (a minimal XML example follows this list).

12. Have your Google Analytics package ready to go along with your Google and Bing Webmaster Tools. Also contact your ad rep if you have any PPC campaigns running to avoid a disruption.

13. Submit your site to the popular search engines so your pages get indexed quickly.

14. Check to make sure that your SERP display is correct and your pages are displaying properly on search engine result pages.

15. Make sure that your social media buttons go to the right pages and allow people to “Like” the right parts of your site.
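For the XML half of step 11, the format follows the sitemaps.org protocol. A minimal example with a placeholder URL and date:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2014-06-01</lastmod>
      </url>
    </urlset>

Each page gets its own <url> entry; <lastmod> is optional but helps crawlers prioritize fresh content.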

 

Network Administration:

16. Install site monitors to ensure visitors can get to your pages and use enhanced monitors for your key landing pages.

17. Run tests with load test software tools to simulate what will happen to your site in times of heavy traffic.

18. Have a backup system ready to go in case your site does crash.

19. Check password-protected pages to make sure people can’t get into them without the proper credentials.

20. If you have a secure certificate, check it to make sure it is valid on launch day.
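You can check step 20 without a browser by pulling the certificate straight off the server and reading its expiry date. A minimal sketch using Python’s standard library; the hostname is a placeholder:

    # Fetch the live TLS certificate and print when it expires.
    import socket
    import ssl
    from datetime import datetime

    host = "www.example.com"  # placeholder hostname

    ctx = ssl.create_default_context()
    with ctx.wrap_socket(socket.create_connection((host, 443)),
                         server_hostname=host) as s:
        cert = s.getpeercert()

    # ssl.cert_time_to_seconds parses the certificate's date format for us.
    expires = datetime.utcfromtimestamp(ssl.cert_time_to_seconds(cert["notAfter"]))
    print(host, "certificate valid until", expires)

Note that create_default_context also verifies the certificate chain, so an invalid certificate raises an error; that itself is a useful launch-day signal.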

Four SEO Myths That Will Ruin Your Online Marketing

SEO has changed the way that we look at online marketing. It’s led to companies getting smarter about how they develop their online marketing strategy and how they can incorporate many tactics to improve their search engine ranking. However, some people seem to think that SEO is a quick fix that eliminates the need for traditional online marketing.


This couldn’t be further from the truth. While SEO is a powerful tool, it should only be one part of an integrated online marketing game plan. In fact, misunderstanding the role of SEO has led some people to develop warped ideas about online marketing. Here are four of the most common misconceptions about online marketing that people have developed thanks to SEO:

You aren’t in a battle against Google. It’s easy to think that Google has something against you personally if you are struggling to reach the first page of results. Much of this comes from the air of mystery around the algorithms Google uses for its rankings. Because of this, many online marketers wildly vary their campaigns to reflect what they think are Google’s latest changes. This is almost always just guesswork. Instead, companies should focus on the core SEO values that don’t change, such as quality content, rather than spending all of their effort trying to please Google.

You need to invest time – and money – in any successful SEO campaign. Search engine optimization is a great way to maximize your marketing budget, but that doesn’t mean it’s “free” marketing. You still need to invest resources into an SEO campaign. This can include spending money on a link building campaign, or time on researching keywords and developing appropriate content.

You won’t make money without offering something of value. While it’s true that you can get a site to the top of Google’s search pages with bought links, spun content and a low-quality website, will this make you money? Eventually, people are going to see the quality (or lack thereof) of your offerings. It’s always better to base your success on high-quality content and products.

Real world testing is better than solely relying on expert advice. It’s never a bad idea to do research and read articles from experts explaining things that have worked for them. But there’s no substitute for doing your own research and learning through trial and error.

12 Non-Content Writing Steps to Improve Your SEO

Google has recently rolled out another algorithm update. This has left many SEO companies and online marketers scrambling to determine how best to comply with Google Penguin requirements. However, attempting to pin down the specifics of a Penguin update and craft content to fit might miss the bigger point of effective search engine optimization: it’s about proper on-site SEO practices along with content.

So what are some of the steps you can take to ensure that your on-site SEO meets Google’s best practices?

Check for malware: You may not engage in black-hat SEO, but if you have malware or someone is running a rogue site through your server, the damage to your SEO reputation with Google could be just as severe.

Write short titles and descriptions: Don’t let them run too long or repeat information. Keep them short and don’t over-optimize tags (a quick length check follows below).
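Length limits are easy to audit in bulk. A rough sketch; the ~60 and ~155 character cutoffs are common rules of thumb rather than limits published by Google, and the URL is a placeholder:

    # Flag a page whose title or meta description runs long.
    import re
    import urllib.request

    url = "https://www.example.com/"  # placeholder page
    html = urllib.request.urlopen(url).read().decode("utf-8", "ignore")

    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    # Assumes the usual attribute order: name before content.
    desc = re.search(r"<meta[^>]+name=['\"]description['\"][^>]+content=['\"](.*?)['\"]",
                     html, re.I)

    if title and len(title.group(1).strip()) > 60:
        print("Title exceeds ~60 characters")
    if desc and len(desc.group(1)) > 155:
        print("Meta description exceeds ~155 characters")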

Check your anchor text: Much like with title tags and descriptions, you want to avoid overusing your keywords or optimizing them too often.

Axe spammy user content: Comments on blog posts are a good thing for organic search engine optimization; however, Google is now punishing sites that allow spam postings in sections of their site such as blog comments.

Increase page speed: A fast-loading site is easier for Google to crawl, and page speed appears to factor into rankings, so optimizing it can improve a website’s SEO.

Cut useless optimization: Don’t bother over-optimizing sections like headers and footers in the hope that “more is merrier.” It won’t help your ranking and will likely lead to penalties.

Review your links: Penguin 2.0 has reduced the threshold of what Google considers too many “spammy” inbound links to a site from 80 percent to 50 percent.

Check internal cross links: It’s easy to have links on one of your sites go to another site you own or partner with; check your links so you don’t get punished by Google.

Be smart with your alt image attributes: Put more of your content marketing effort into alt image attributes than you do now; Google can give them as much weight as actual text.

Don’t overload ads: Having too many ads on your site – especially high on the page, above the fold – is frowned on by Google.

Add 301 redirects: Links that lead to dead ends tell Google that your site is not properly maintained.

Review crawl rates: Make sure that Google’s spiders are able to crawl your site correctly.