Okay, I’m going to say something that may get me in trouble with the gurus and everyone who makes their living off of “secret” SEO strategies.
SEO (search engine optimization) is overrated!
There, I said it. Now I’m going to turn off comments here because I don’t want to get any hate mail.
Why do I say that?
Well, first, let me say that I’m not against SEO, per se. I mean, I want to practice good SEO. I’ll do things that make sense like writing good descriptions for my websites, and maybe putting my keyword in my title. Maybe. And I’ll keep an eye on keyword density in an article, but I won’t obsess about it.
You know why?
Because in 18 months all of the rules may have changed!
The search engines keep changing their algorithms. If it’s not Google’s “Penguin” update it will be the “Walrus” update next, or the “Wildebeest” update.
And what is the purpose of all these updates?
The purpose is to give people what they are searching for. (An algorithm, in this context, is the program and set of rules a search engine company uses to decide which website best answers someone's search for a particular keyword.) When a website meets all of the criteria, it goes to the top of the organic search results.
Of course, this is a very valuable position, and when I can get a niche website to page 1 in Google it is a cause for celebration.
But what if you can’t get to the top spot?
Or what if you have painstakingly made it to the top of the mountain in Google? You have 97 niche sites. All of your traffic is organic. You don't pay a dime for it. You're making over $10,000 per week. You are teaching organic traffic seminars and webinars and charging $5,000 per seat. All of your dreams have come true!
Then, all of a sudden, they come out with the “Loch Ness Monster” update.
Don’t laugh. A lot of SEO gurus have had just this sort of thing happen to them.
So, let’s think about this. What is the reason the search engines are using algorithms in the first place?
Well, they could hire thousands of human beings to read every website in the world and vote on the rankings. That seems a little inefficient, though.
So the next best thing is probably to automate the process with computers. You give them some parameters like the number of relevant backlinks, the age of the site, and keyword density. How long is the article? Is the keyword in the title? Is the keyword in the URL?! The bots are ecstatic! They've found a winner.
All that stuff is great, and I'll keep an eyeball on it, but I'm not going to obsess about it. If Google puts me at the top of their pile it will be gravy, but I won't be spending every waking moment wondering where I'm ranking.
I have this sneaking suspicion that someday the search engine bots will be looking for professional SEO software plugins and will actually penalize sites that use them, figuring they are just gaming the system. Maybe they will look for all sites with a keyword density of 2.2% or higher and toss them out in favor of sites with less keyword density but more synonyms.
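If you've ever wondered what a number like that actually measures, keyword density is usually just keyword occurrences divided by total word count. Here's a rough sketch in Python (the 2.2% above is a made-up threshold, and this is an illustration of the general idea, not any search engine's actual formula):

```python
import re

def keyword_density(text, keyword):
    """Rough keyword density: occurrences of the keyword as a
    percentage of total words in the text."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words) * 100

article = ("SEO tips for writers. Good SEO starts with good writing, "
           "not with chasing an SEO score.")
print(keyword_density(article, "seo"))  # 3 hits out of 16 words = 18.75
```

A density that high would read as keyword stuffing to a human, which is exactly the point: the robots count, but readers notice.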
If you want to be what the search engines are looking for, then maybe the best way to accomplish that is to just have stellar, literate content. Maybe the best thing to do is to ignore the search engines and improve your writing skills, deliver superior information, and concentrate on connecting with your audience.
Now that is a revolutionary idea! 🙂