Google Notes of Note: Title Attribute Abuse

May 16, 2012 By Dan Thies 11 Comments

An interesting note from Barry Schwartz over at Search Engine Roundtable today… Barry spotted a thread over on the Google Webmaster Help forums, where Googler John Mueller (aka JohnMu) noted that a particular site had approximately 3 metric tons of text stuffed into the title attribute of links, and said:

To our algorithms, that might look a bit sneaky — and in practice, it doesn’t make that much sense, so I’d recommend going through your pages and making sure that you’re using title-attributes as they would normally be used.

In case you have no idea what I am talking about – the title attribute is used on images and links to provide a “tool tip” when you mouse over it. So a sensible title attribute for a link might say “click to sign up” but there’s not much point in even doing that, if the link already says the same thing.
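
If you want to check your own pages for this, a quick audit script is easy to write. Here's a minimal sketch using Python's built-in html.parser – the 80-character cutoff is my own arbitrary threshold, not anything Google has published:

```python
from html.parser import HTMLParser

class TitleAuditor(HTMLParser):
    """Flag <a> and <img> tags whose title attribute looks stuffed."""

    def __init__(self, max_len=80):
        super().__init__()
        self.max_len = max_len
        self.flagged = []

    def handle_starttag(self, tag, attrs):
        if tag not in ("a", "img"):
            return
        title = dict(attrs).get("title", "")
        if len(title) > self.max_len:
            self.flagged.append((tag, title))

# A made-up example of the kind of stuffing described above:
sample = ('<a href="/chemicals" title="pool chemicals pool supplies swimming '
          'pool chemicals discount pool chemicals cheap pool chemical supply">'
          'Chemicals</a>')

auditor = TitleAuditor()
auditor.feed(sample)
for tag, title in auditor.flagged:
    print(f"suspicious {tag} title ({len(title)} chars)")
```

Feed it your real HTML and eyeball anything it flags – a long title attribute isn't automatically spam, but it's worth a look.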

Anyway – this is the first time I’ve heard this specific issue raised in public by a Googler, but it’s consistent with what we’ve been seeing in the “Penguin cases” I’m working with.

A number of folks who were hit had learned from some-guy-who-called-himself-an-SEO-expert that they should stuff keywords into the title attribute of images and links. That’s a Ridiculously Terrible Idea™ – and in fact, at the time, none of the major search engines was even indexing the title attribute as content. It was a total waste of time, at best. Which is what I’ve been telling people for years – “waste of time, don’t do it.”

For all I know, they’re still not indexing that attribute (they shouldn’t be, but we’ll run that test again) – but in the thread Barry found, the site in question uses so much text there that the actual text behind the “tool tip” isn’t even fully visible with the mouse pointing at it.

This points at Google looking for “spam signals” even in places where what you are doing would have no impact on their algorithms. Which the results of the Penguin update already hinted at pretty strongly, no?

As a result, I’m officially changing my recommendation from “stuffing keywords in your meta description means fewer people will click on your search listing, so don’t do it” to “stuffing keywords into your description tag is not just stupid, it’s also risky, so don’t do it.” Please adjust your explanations to clients accordingly.

Before anyone asks, you should also not “stuff” your keywords meta tag. If you have one. In fact, my recommendation remains “do not even bother using a meta keywords tag.”

That’s all – just thought this was interesting, and potentially important to some people out there.

Thanks!
Dan

Filed Under: Free Stuff

Hit By Penguin? Take the Red Pen Test

May 11, 2012 By Dan Thies 78 Comments

Like a lot of folks in the SEO world, I’ve been analyzing, dissecting, and pondering the significant changes that Google made in April 2012 – from Panda to Penguin and a lot of little things in between.

Leslie and I have the good fortune to work with hundreds of clients in our training and coaching programs, which provides us with access to detailed Analytics data on a large number of websites.

Within that group, we’ve now identified over two dozen cases of “confirmed” Penguin hits.
The analysis of those sites has proven to be very interesting indeed.

While much of the Penguin reporting to date has focused on inbound links as a potential issue, we’ve struggled to find many cases within our customer base that fit the bill for “unnatural” linking – possibly because many of our clients don’t really do traditional “link building” at all.

We have, however, in *every* case, found significant on-site issues that could be contributing to problems with the Penguin update, which Google described as targeting “webspam.”

“In the pursuit of higher rankings or traffic, a few sites use techniques that don’t benefit users, where the intent is to look for shortcuts or loopholes that would rank pages higher than they deserve to be ranked. We see all sorts of webspam techniques every day, from keyword stuffing to link schemes that attempt to propel sites higher in rankings.”

Source: Google Inside Search Blog

So how do you respond to this? Well, if you’ve gone nuts building unnatural links, I would encourage you to at least consider making some changes to your link building practices. However, if you are working with a site that has been affected by the Penguin update, now would be a very good time to look at potential on-site issues as well.

Take The “Red Pen” Test

Here’s a little exercise we’ve been having our clients do for a while. The purpose of the exercise is to identify potential “excessive” SEO on your website, and help you improve the site’s design to serve users better, convert better, and consistently rank better through algorithm changes in search engines.

What you will need:

  1. A couple of highlighters – I like to use green and yellow, but any two colors will do.
  2. A nice big fat red marker – the kind your mean old teachers used in school to grade papers.
  3. An open mind – you can lie to me, but don’t lie to yourself, because it’s bad for you.

Step 1: Print Out a Copy of Your Home Page

For many sites, you can simply print it out “as is” – but by doing so you may actually *miss* some things like hidden text, duplicated text, and links that are “concealed” by styling them to look exactly like the surrounding text. If you know how to disable CSS, you can print the page with all styles disabled, and get a pretty good printout to work with.

Most folks will just find it easier to use the text-only version of Google’s cached copy of the page. To find this, use the following procedure:

  1. From www.google.com or a Google search box in your browser, search for cache:URL or cache:www.domain.com
  2. For example, you can type cache:seomoz.org in a Google search box to get the cached copy of the SEOMoz home page.
  3. Click the little link that says “Text Only version” and print that out.

Not so pretty, is it? But this is a lot more like what your pages look like to spiders!
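
If you'd rather build your own text-only view, you can approximate one by stripping the markup. This is just a rough sketch with Python's built-in html.parser – it won't match Google's cache exactly:

```python
from html.parser import HTMLParser

class TextOnly(HTMLParser):
    """Crude text-only rendering: drop markup, skip script/style contents."""

    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.skipping = 0   # depth counter for script/style nesting
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self.skipping += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self.skipping:
            self.skipping -= 1

    def handle_data(self, data):
        if not self.skipping and data.strip():
            self.chunks.append(data.strip())

# A tiny made-up page, just for demonstration:
page = ("<html><style>p{color:red}</style><body>"
        "<h1>Pool Supplies</h1><p>Chemicals</p></body></html>")
parser = TextOnly()
parser.feed(page)
print(" | ".join(parser.chunks))  # prints: Pool Supplies | Chemicals
```

Run your own home page through something like this and you'll see, very quickly, what the spiders see.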

Step 2: Highlight the “SEO Keywords” on the Page

Okay – grab the printout, and pick up one of your highlighters. Start at the top of the page, and highlight every use of a “keyword” on the page that isn’t absolutely required to let human visitors do one of three things:

  • Figure Out Where They Are: Am I on the right web site? Where am I within the web site? (Keywords rarely help with this)
  • Figure Out What To Do Next: Where do I click to find the women’s shoes? How do I sign up? (Keywords sometimes help)
  • Whatever Else The Page Is For: Are there any other things the page is supposed to accomplish? (Aside from “rank #1 in Google”)

Now, before you finish this task, open your mind, and think. If you sell “pool supplies,” and it’s obvious to your visitors that this is all you do, would the links to the “pool chemicals” category page *really* need to say any more than “Chemicals” in the anchor text? No, they wouldn’t – so go back with your highlighter and finish the job.

The point here is to uncover how many times you’re using keywords for the benefit of both you and your users, and how many times they’re really just there to “check a box” for SEO purposes.

That doesn’t mean you won’t ever use keywords in your copy, if it’s the right word to persuade a user to take action. However, if you didn’t need to use a keyword to say it, you probably want to highlight it.

It can be helpful to have your website open in a browser while you do this exercise – it can be amazing how many times you discover that someone stuffed every keyword you have into the alt attribute of an image, for example.

Step 3: Highlight the Excess Links on the Page

For the next step, switch highlighters, and start at the bottom of the page. Highlight every link on the page that fits the following criteria:

  • Useless: Examples – a link to the designer’s site, to an “articles” section that people aren’t intended to read, etc. Does the link help your visitors accomplish one of the three objectives listed in step 2? If not, it had better be “required by law” or it gets highlighted.
  • Redundant: Links that already exist somewhere higher on the page, or links to redundant categories/pages that only exist to cover extra keywords with the same content.
  • Hidden/Concealed: You can see a blue underlined link on the “text only” printout, but you can’t actually see the link on the website in your browser.

If you use visitor analytics tools like CrazyEgg, you’ll find that a lot of the links you have highlighted during this step actually don’t get clicked at all.
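
The “redundant” check lends itself to a bit of automation, too. Here's a rough sketch using Python's built-in html.parser (the example page is made up) that counts how many times each href appears:

```python
from collections import Counter
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect every href on a page so duplicates can be counted."""

    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.hrefs.append(href)

# A hypothetical page with the same destination linked three times:
page = """
<a href="/shoes">Women's Shoes</a>
<a href="/shoes">women's shoes online</a>
<a href="/about">About Us</a>
<a href="/shoes">cheap women's shoes</a>
"""
collector = LinkCollector()
collector.feed(page)
redundant = {h: n for h, n in Counter(collector.hrefs).items() if n > 1}
print(redundant)  # prints: {'/shoes': 3}
```

Anything that shows up more than once is a candidate for the highlighter – not automatically bad, but worth asking why it's there twice.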

Step 4: Time for the Red Pen Test (optional)

At this point you will have a page with, well, either a little bit of highlighter on it, or a lot. Scan, photograph, or copy the highlighted page for future reference… then grab the red marker, and run it over all of the highlighted text on the page. Stand back and admire your work.

Many of our clients have pointed out that this step is not strictly necessary – with comments like “okay, I get it I get it” often occurring before they’ve even finished with the first highlighter – but there is a point to using the red pen.

Step 5: Analyzing Your Results

Of course, this exercise is probably worth repeating (at least mentally) with other pages on your site – maybe even all of them.

If you have more than a little bit of red ink on the page, there’s a very good chance that the following statements are true:

  1. You have been “doing SEO” on your site for a long time, and adding more keywords to more places helps you feel like you are “doing something” to rank better. There is a lot of great SEO advice out there on things that could help you rank better, but you’re not meant to do all of it – just enough to get the job done.
  2. Adding more keywords to more places is not actually helping you to rank better, and never has been. I’ve “owned” top positions in many markets for years, with a standing rule to never allow more than two exact occurrences of a target keyword in on-page copy. In fact, I’ve got plenty of pages that have ranked for years with zero (0) occurrences of the exact keyword in the copy, and no “unnatural” links either.
  3. You’ve been hit by Penguin, and/or Panda, and/or the new “keyword stuffing filters,” etc. – and you’re ready to make a change.

The truth is what it is: a lot of people have been going way, way, way, over the top with on page and on site SEO – even if they haven’t gone nuts with their link building. At this point, these practices are very likely to be hurting your rankings. If you’ve been able to rank in spite of these practices for a while, that’s great – but you knew it couldn’t last forever, right?

Consider the Following:

Panda, Penguin and the less-publicized “keyword stuffing” update from April are all based on document classifiers. Their job is to look at a document (your web page), and use a set of signals to determine whether it should, or should not, be identified as “low” or “high” quality, “webspam,” “stuffed with keywords,” etc.

The “red pen” test gives you some idea what the result is going to look like, and how they’re going to get there – because the job of these document classifiers, in a sense, is to simulate the outcome of a human being performing such an exercise.
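
To be clear, nobody outside Google knows what signals those classifiers actually use. But as a toy illustration of the general idea – my own made-up metric, not Google's method – here is the kind of crude signal a keyword-stuffing classifier might start from:

```python
def keyword_density(text, keyword):
    """Fraction of words in the copy that are the target keyword (toy metric).

    Works for single-word keywords only; a real classifier would use many
    signals, not just one number like this.
    """
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Two invented snippets of page copy:
clean = "We sell supplies for every kind of backyard pool and spa"
stuffed = "pool supplies cheap pool supplies discount pool supplies online pool supplies"

print(round(keyword_density(clean, "pool"), 2))    # low single-digit percent
print(round(keyword_density(stuffed, "pool"), 2))  # dramatically higher
```

The red-pen exercise is, in effect, you computing a much smarter version of this by hand – with human judgment about what the page actually needs.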

If your home page came back covered in red ink, and you’ve been “slapped by Google,” now might be a really good time to turn the “on page SEO” all the way back to the bare minimum that is required for users, and work forward from there, with little changes that hint at ranking, instead of wholesale keyword stuffing on every page of your site.

If you have pages on your site – or entire sections of your content – that are solely devoted to stuffing in more internal keyword links for ranking purposes, now would be a good time – while you’re at “rock bottom” – to clean that mess up.

And if your business is going to survive, it might also be a good time to start thinking about redesigning your website from the ground up, with human users and human intentions in mind. That’s another exercise (and another post…) for another day, but it starts with asking these questions about your visitors:

  • Identity: Who are they? How would they identify themselves?
  • Motivation: Why are they here? What do they care about right now?
  • Framing: How did they get here? Where did they come from?
  • Purpose: What specifically are they trying to accomplish?

The better each page of your site does at addressing these things, the better you’ll do in the long run. Let’s talk about that soon, okay?

Disclaimer: Author is not responsible for you ruining your pants or furniture due to leakage, seepage, or dripping of red ink from an ink-soaked printout of your home page. If your site is primarily designed for keywords and not for humans, wear appropriate protective garb before performing this exercise.

Thanks for reading,
Dan

P.S. If you’re dealing with the Penguin, check my earlier posts on this for more detail, and don’t miss our Penguin “brain dump” webinar Wednesday, May 16, at 7pm US Eastern time:

  • Penguin Cases of Interest: Twin Studies
  • Quick Penguin Updates – What We Learned This Week
Filed Under: Free Stuff

Quick Penguin Updates – What We Learned This Week

May 11, 2012 By Dan Thies 2 Comments

Just some quick notes to keep you up to date. More coming, but the East coast folks are already demanding the answers that I promised to give today, so…

  • Penguin = Periodic Updates: Google did confirm what most SEOs assumed… Penguin, like Panda, involves an offline data processing function, which means that there will be “Penguin data pushes” similar to the Panda data updates which have been rolling out every 4-6 weeks on average.
  • This means that whatever work you may have done to “improve” will likely take some time to be reflected in Google’s scoring. Maybe even longer than you think.
  • If it takes a week to “process,” it probably takes a day or two for QA, so figure that the “Penguin data” will be at least 8-10 days old by the time they “push” it out.
  • Your site is likely only being indexed and updated every couple weeks – the Googlebot may visit all day long, but they aren’t updating your entire site every day in most cases.
  • Add it all up, and figure that it’s likely a minimum of 3-4 weeks from the time you make changes until another “Penguin update” could possibly do anything for you. Possibly much longer.
  • Inbound links are a *part* of this – but not the whole thing. A combination of factors adds up to either work for you or against you in this thing. It’s not as simple as just one thing, but there are some things you can change. More on that (beyond what’s already been posted) later today.

If this doesn’t convince you that Google doesn’t care about what happens with individual sites, I don’t know what will. I am not saying that they should or shouldn’t care, or even that they are capable of caring… it just is what it is.

If this doesn’t convince you that you need to work to stay independent of Google, nothing will.

Other stuff…

  • I will be mailing another update out to our email subscribers later today, with some exercises you can do to examine your site. Getting on our mailing list would be a good idea. The form is over on the right of the page somewhere. It doesn’t cost any money.
  • We’re doing a more in-depth webinar with special guest David Harry of the SEO Dojo next Wednesday, informally dubbed “The Penguinar.” Leslie and I have been doing work on the on-site factors, David has been digging into off-site factors like links.

As with all of our webinar events, we will be recording, and we’ll get a replay posted as quickly as we can, so those folks who live in an inconvenient time zone don’t have to stay up until 3:47 am to hear what we have to say.

Thanks!
Dan

Filed Under: Free Stuff

Penguin Cases of Interest: Twin Studies

May 3, 2012 By Dan Thies 17 Comments

Like a lot of folks in the SEO world, I’ve been looking at sites which were “hit” by Google’s Penguin update.

At this point, I have done a detailed analysis of the first 7 cases (involving a total of 9 sites) on my list. In six of the seven, I have found what I could call “obvious” on site issues. Things like:

  1. Hidden text. Not necessarily “manipulative” but still, hidden text. 3 of the 6 positive cases have hidden text that was easy for me to find; in no case was it clearly “manipulative,” but it was clearly hidden.
  2. Alt and title attribute abuse – keywords stuffed into the alt and title attribute of images, and in the title attribute of anchor tags (links).
  3. “Holy cow” copy at the bottom of the home page and/or other pages, up to and including all pages. By “holy cow” I mean lengthy blocks of copy, stuffed with keywords and links, often in smaller type, gray text, etc.
  4. Hidden or disguised links – as in, the only way to know it’s a link is to mouse over the text.
  5. Blogs and “article” sections full of terrible content, stuffed with keyword links – where the links may or may not be disguised.

Since all of these things should be cleaned up regardless of whether the Penguin even cares about them, it’s been easy for me to advise the owners of those sites to clean that mess up. If you’re spamming, don’t wait for Google to tell you about it.

This leaves me (so far) with 3 very interesting cases…

  1. Two cases of “twin” sites – same owner, one site is “hit” by Penguin, and the other is not.
  2. One case where the site appears, on the surface, to be a “false positive” in terms of on site issues.

“Twin studies” are interesting in science and medicine, because studying twins (where you know the DNA is the same) helps distinguish between what is caused by genetics (nature) and what is caused by environmental factors (nurture).

In the case of the Penguin update, “twin sites” may tell us a lot about what’s going on, and what has changed in the environment. So let’s take these one at a time…

Case #1: The Hidden Text Case
Pretty simple – they have two sites in slightly different niches of the same larger market. The sites are linked together.

One of the sites has a significant amount of hidden text in the template, the other does not. The site with the hidden text lost significant traffic and rankings on April 24/25, the site without hidden text did not. The hidden text in this case involves the use of CSS to drop an image on top of the text.

This hints that hidden text is a bad idea. As if we needed a hint.

Case #2: Gray-on-Gray vs. Black, White, and Blue all over.
Similar to case #1 – two sites, same market, different niches. The sites are not linked together, but the domain registration info is identical, as is the physical address on each site, so it’s no secret that they are owned by the same company.

One of the two sites has “holy cow” copy at the bottom of the home page, with text in gray on a (lighter shade of) gray background – #666666 on #cccccc, if you care. The links stuffed into the “holy cow” copy are the same color as the rest of the text and not underlined; the only way a normal human visitor would know it was a link is by mousing over it. This site lost just under 40% of its referral traffic from Google at the Penguin update.

The second site has very similar “holy cow” copy, but the text is in black (#000000) on a white background (#ffffff), and the links are unstyled – that is to say, the links are blue and underlined, and obviously links. This site is up 3% since Penguin, but a sampling of rankings indicates no change, so this is simply a natural seasonal increase which both sites would probably have experienced if Penguin had not occurred.

This hints that disguising links is bad. As if we needed that hint.
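
For what it's worth, you can put a number on how unreadable that gray-on-gray combination is. Using the standard WCAG contrast-ratio formula (a published accessibility standard – no claim that Google uses it):

```python
def luminance(hex_color):
    """Relative luminance of a #rrggbb color, per the WCAG 2.x definition."""
    def channel(c):
        c /= 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (int(hex_color[i:i + 2], 16) for i in (1, 3, 5))
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast(fg, bg):
    """WCAG contrast ratio between two colors (1:1 up to 21:1)."""
    hi, lo = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (hi + 0.05) / (lo + 0.05)

print(round(contrast("#666666", "#cccccc"), 2))  # ~3.6, below WCAG's 4.5:1 minimum
print(round(contrast("#000000", "#ffffff"), 2))  # 21.0, the maximum possible
```

So the “penalized” twin's text copy falls short of even the minimum accessibility threshold for normal text – hard for humans to read, and apparently a signal worth hiding things behind.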

Case #3: The prolific link spammer…
I mention this one because I simply haven’t been able to find any big issues with the site itself. It is possible that they are cloaking, and not telling me about it, because people sometimes do stupid things and lie to their doctor about what they have done. This site’s Google organic referrals are down by about 26% in the post-Penguin week vs. the week prior*. It’s very likely that I’ll find something “on site” once I dig a little deeper.

This site does have a pretty aggressive inbound linking profile – that is to say, a ridiculously implausible number of links coming in with anchor text precisely matching the queries where they were most affected. Whether this means that they have been “penalized” for over-optimizing their inbound links, or simply that part of Penguin is giving such links less weight in ranking, we can’t say. The latter seems more likely.

The truth is that we don’t even know if inbound links have anything to do with the Penguin thing. I know what you are reading out there. I read it too. Most of what’s being written is simply one person parroting what another person said.

The best bit of evidence about inbound links is inconclusive but highly informative, once you understand who the data was collected from, and how. Those who spam aggressively off site are likely spamming aggressively on site – correlation is not causation – tails do not wag dogs.

That doesn’t mean the data from Micro Site Masters is useless – far from it, it’s a strong hint that we should look at more cases like case #3 – and it gives us better ideas on hypotheses to test.

Sorry, no “miracle cure” today…

Anyway – sorry I don’t have a “7 steps to fix your problem” ready just yet. I don’t think it would be particularly responsible to post something like that, when we are all still trying to work out what’s happening.

However, given the clear number of positive cases where there was pretty obvious spam on the site, those who are affected might do well to consider whether their sites really represent “best practices,” or “what we thought we could get away with.”

We’ll have more for you soon – thanks for reading!

– Dan Thies

* At any change event, you need 7 days’ worth of data to have sufficient confidence about which search terms have been affected. We often see little bumps and bounces in rankings and search volume throughout the week, and it’s far more productive to analyze SERPs that we know have changed.

UPDATE: Just to be very clear – although I’m not convinced that inbound link spam is a factor in Penguin, it’s almost certainly a good idea to clean it up. If you’re going to clean up a bunch of link spam, document what you do – the links you were able to remove, the links you couldn’t remove and the reason why, etc. If you end up submitting a reconsideration request, it will go a lot better, and a lot faster, if you can provide a spreadsheet detailing what you’ve done.

Filed Under: Free Stuff

Understanding the Facebook Like Button: Not All Likes Are Equal

May 2, 2012 By Dan Thies Leave a Comment

That darned “Like” button is everywhere these days…

Matt Inman, famously of The Oatmeal web comic, and less famously (also formerly) of SEOMoz, lampooned the excesses of like-mongering in his own hilarious way, just a few weeks ago. That’s some serious funny.

What may be not so funny for marketers is how much opportunity they leave on the table when they don’t understand how the “like” button works.

When I was in the 2nd grade, I passed a girl in my class a note, which said:

She read the note, but didn’t pass it back. All day, I was on pins and needles, wondering if she liked me… nothing. Then, after school, at the bus stop, she came up to me and whispered, “I like you… but I don’t like you like you.”

Filed Under: Free Stuff

How to Retweet on Twitter – No, Seriously!

May 1, 2012 By Dan Thies 3 Comments

Okay, look – I know what you’re thinking… “Dan Thies has lost his mind, and started building eHow-style doorway pages.” No. I promise. Here’s the thing…

Most people using Twitter – and sadly, especially, marketers – do not understand how the way you retweet impacts what happens down the line.

So indulge me for just a minute, and think about this. There are (nominally) two ways to retweet:

  1. You can push the “ReTweet” button, and the tweet in question will appear in your Twitter timeline, like this:
  2. You can use the “old style” retweet, often described by Twitter clients as “ReTweet with Comment”

Now, if you’re going with the “new” Twitter ReTweet Button approach (#1), you don’t really control anything, other than what time of day you push the button. If you truly have nothing to add, go ahead and push that button… well, maybe – but there’s one more thing to consider.

No matter which way you do it, the person whose tweet you have just retweeted will see your retweet in the “Interactions” tab on Twitter.com. However, in most Twitter clients the ReTweet-Button retweets will not show up as a “mention,” and the person you retweeted will not know you did it.

But hey, if you have nothing to add, and it doesn’t matter if the original tweeter knows you retweeted (I am pretty sure @CNN does not care if you retweet them, for example), go ahead and push that button.

In most cases, you can add more value by using the old “RT @so-and-so” style of retweeting.

Yes, it will increase your “Klouchebag” score, but that should be exactly as important to you as your Klout score, which is to say, not at all. Apparently some “purists” believe that the old-style retweet is gauche… a faux-pas… but whatever. We bailed the French out in two world wars, so they owe us one. I’m calling in that chip to keep the old-style retweets alive.

Once you see the different ways that you can add value by doing so, I think you’ll agree.

  1. You can add clarity. Check out this tweet, and consider what adding a comment like “Cool art project RT @sernovitz” to the start would do, to help people seeing it understand what they are clicking:
  2. You can give credit where credit is due, and let the author of a story know you care:
  3. You can also skip the whole “RT @” business, write your own tweet, and give a “hat tip” mention via:
  4. You can add hashtags too, so if you’re participating in #SEOChat, and you want to throw an old Tweet into the conversation because it’s relevant, and give credit to the author and the person you learned about it from, you can do that too.

All of these methods, when used correctly, can add value to your retweets, by making them more useful, relevant, and interesting to the people seeing them. I’ve discovered many great SEO people on Twitter, because others took the time to add their handle, or the right hashtag, to a tweet or retweet.

As a marketer, you won’t be successful with social media unless you recognize what the word social implies. Social media is not a product, or an application – social media is people.

The “old-style” retweets allow you to interact with, engage with, and acknowledge the contributions of other people. This is more than a feel-good exercise.

Doing so actually helps to build your following, makes others more likely to retweet your contributions to the conversation, and creates that “I’ve seen you before and I like you” recognition, which you don’t get when people don’t see you talking about them.

Heck, I don’t follow “just anyone” on Twitter, but I do tend to follow the people who are retweeting my stories. I do so because they are very likely to be sharing similar stories, from other people. Stories which are likely to be of interest to me, and which I may not learn about any other way.

This post is long enough… so… let us know if you liked it, and if you’d like to see more of this kind of “foundational” content from the SEO Braintrust.

For a bit more on leveraging social media in your marketing and SEO campaigns, I posted a short video a few weeks back on Building Links with Social Media, which you may want to watch. It’ll take about ten minutes of your time.

Thanks!
Dan

Filed Under: Free Stuff

Adwords Alert: Changes to Ad Rotation Coming Next Week

May 1, 2012 By Dan Thies Leave a Comment

Google has announced, on even shorter notice than usual, that they will be changing something.

If you’ve been paying any attention at all, you know that every Adwords advertiser should be testing their ads.

One of the most critical parts of running any ad test is the ad rotation setting in your campaign – if it’s not set to “Rotate Evenly,” you aren’t running a clean test. It’s that simple.

Well, Google has announced a change in the way that ad rotation settings will work, and it’s going to kick in next week. You cannot stop this. You cannot opt out.

Here’s how it works:

  • From “next week” (you didn’t expect a specific date out of Google, did you?) on, any ads that haven’t been changed in the past 30 days will begin to be displayed under the “Optimize for Clicks” rotation method.
  • This will happen, even if the campaign settings say “Rotate Evenly.” If any of your “rotate evenly” campaigns have ads that are already 30 days old, the change will happen next week on all of these ads.
  • If you never let a test run for more than 30 days anyway, then this will not affect you.
  • If for some reason you do need to run tests for more than 30 days, I would suggest changing a non-visible part of the ad (such as the destination URL) in a way that is unlikely to affect your results.
  • As an example, you could add a named anchor to the URL, so the destination (landing page) URL changes from example.com/somepage.html to example.com/somepage.html#1

For most advertisers, this is not a big deal, because you’re probably not running tests for more than a month. Maybe they’ll even be doing you a favor, if you somehow forgot about a test and left it running.

However, this would be a good time to review your ad testing practices, and to remember that you aren’t optimizing for “clicks” or for a lower “cost per click” – you are optimizing for profitability.

I’ll have more to say on ad testing best practices next week. For now, well, I just wanted to make sure you didn’t miss this.

Thanks for reading,
Dan

Filed Under: Free Stuff

The WordPress Plugin that Can Get You Penalized

April 30, 2012 By Dan Thies 41 Comments

Every month or so, we get a question from a student about whether we can recommend using the “SEO Smart Links” plugin for WordPress. The answer is always no, because the way it works inevitably creates a terrible user experience. Well, now we’ve got evidence that it can do far worse than that.

After reviewing the evidence, I am convinced that using SEO Smart Links can get you penalized by Google. This does not mean that using this plugin will instantly cause a penalty, but the way we see most people using it is extremely risky.

What is SEO Smart Links? Well, this plugin’s intended purpose is to improve ranking by automatically linking every occurrence of a keyword or phrase within your WordPress site to a target URL that you’re hoping to get ranked.

There are undoubtedly some “smart” ways to use this plugin. I can think of several – like automatically linking a call to action (“click here to sign up”). However, right-out-of-the-box, it is one heck of a dangerous tool. Out of the box, in the free version, it doesn’t even limit the number of links it creates on a page.
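
I haven't read the plugin's source, so this is a hypothetical sketch of the behavior described – auto-linking every occurrence of a keyword, with no cap unless you add one yourself – not the plugin's actual code:

```python
import re

def autolink(text, keyword, url, max_links=None):
    """Wrap occurrences of keyword in a link.

    max_links=None means unlimited, which mirrors the dangerous
    out-of-the-box behavior described above.
    """
    count = 0

    def repl(match):
        nonlocal count
        if max_links is not None and count >= max_links:
            return match.group(0)  # leave further occurrences untouched
        count += 1
        return f'<a href="{url}">{match.group(0)}</a>'

    return re.sub(re.escape(keyword), repl, text)

post = "SEO tips: good SEO starts with users, not with SEO tricks."
print(autolink(post, "SEO", "/seo-page"))           # all 3 occurrences linked
print(autolink(post, "SEO", "/seo-page", max_links=1))  # only the first linked
```

With no limit, a long post mentioning a keyword a dozen times gets a dozen identical internal links – exactly the kind of pattern you don't want stamped across every page of your site.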

Imagine what it would look like if every occurrence of “SEO” in this post were linked to our home page, or to some doorway page. To make it easy for you, I linked a bunch of SEOs (to href=”#”) already. Now imagine if we also wanted to rank some different pages for keywords that included words like [Wordpress], [plugin], [penalized], [user experience], [keyword], [ranking], [goog
