Advanced White Hat SEO Exists Damn It! - Dynamic SEO
Hello again! I’ve been restless and wanting to write this post for a very long time, and I’m not going to be happy until it’s out. So get out your reading glasses. I have it on good authority that every reader of this blog happens to be the kind of dirty old man that hangs out and harasses high school chicks at gas stations, so don’t tell me you don’t have a pair. Get ‘em out and let’s begin…
Fuck, how do I intro-rant this post without getting all industry political? Basically, this post is an answer to a question asked a long time ago at some IM conference to a bunch of gurus: does advanced White Hat SEO exist? If I remember right (and this was a long time ago and I was probably buzzed, so forgive me), every guru said something along the lines of there being no such thing as advanced White Hat SEO. Now I’m sympathetic to the whole self-promotion thing to a small degree. If your job is to build buzz around yourself you have to say things that are buzzworthy. You can’t give the obvious answer: YOU BET IT DOES, AND YOU’RE AN IDIOT FOR ASKING! You’ve got to say something controversial that gets people thinking, but not something so controversial that anyone at your popularity level can contradict it in a sensible way, making your popularity look more overrated than a cotton candy vendor at the Special Olympics. In short, yes, advanced white hat exists and there are tons of examples of it; but you already knew that, and I’m going to give you such an example now. That example is called Dynamic SEO. I’ve briefly mentioned it in several posts in the past, and it is by every definition simple good ol’ fashioned on-site keyword/page/traffic-optimizing White Hat SEO. It also happens to be very simple to execute but not so simple to understand. So I’ll start with the basics and we’ll work up to building something truly badhatass.
What Is Dynamic SEO? Dynamic SEO is simply the automated, no-guessing, self-changing way of SEOing your site over time. It is the way to get your site as close to 100% perfectly optimized as needed without ever knowing the final result AND automatically changing those results as they’re required. It’s easier done than said.
What Problems Does Dynamic SEO Address? If you’re good enough at it you can address EVERY SEO-related problem with it. I am well aware that I defined it above as on-site SEO, but the reality is you can use it for every scenario, even off-site SEO. Hell, SQUIRT is technically dynamic off-site SEO. Log Link Matching is another example of advanced off-site Dynamic SEO. The problems we’re facing with this post specifically include keyword optimization, which covers keyword order, keyword selection, and even keyword pluralization.
See, the problem is you. When it comes to the subpages of your site you can’t possibly pick the exact best keywords for all of them and perfectly optimize every page for them. First of all, keyword research tools often get the keyword order mixed up. For instance, they may say “Myspace Template” is the high-traffic keyword when really it could be “Templates For Myspace”. They just excluded the common word “for” and got the order wrong because “Template Myspace” isn’t popular enough. They also removed the plural to “broaden” the count. By that logic “Myspace Templates” may be the real keyword. Naturally, if you have the intuition that this is a problem you can work around it manually. The problem is that not only will you never be perfect on every single page, but your intuition as a more advanced Internet user is often way off, especially when it comes to searching for things. Common users tend to search for what they want in a broad sense. Hell, the keyword “Internet” gets MILLIONS of searches. Who the fuck searches for a single common word such as Internet? Your audience, that’s who. You, on the other hand, tend to think more linearly with your queries because you have a higher understanding of how Ask Jeeves isn’t really a butler that answers questions. You just list all the keywords you think the desired results will have. For instance, “laptop battery hp7100” instead of “batteries for a hp7100 laptop.” Dynamic SEO is a plug-n-play way of solving that problem automatically. Here’s how you do it.
Create A Dynamic SEO Module The next site you hand code is a great opportunity to get this built and in play. You’ll want to create a single module file, such as dynkeywords.pl or dynkeywords.php, that you can use across all your sites and easily plug into all your future pages. If you have a dedicated server you can even set the module file up to be included (or required) from a common path that all the sites on your server can access. Give the script its own SQL database; that single database can hold the data for every page of all your sites. You can always continue to revise the module and add more cool features, but while starting out it’s best to start simple. Create a table with a field structure similar to ID, URL, KEYWORD, COUNT. I put ID in just because I like to always have some sort of primary key to auto-increment. I’m a fan of large numbers, what can I say?
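Since the post leaves the storage details to you, here’s a minimal sketch of that table using SQLite from Python (the post suggests Perl or PHP; the table and column names here are just illustrative):

```python
import sqlite3

# Illustrative schema matching the ID, URL, KEYWORD, COUNT layout described above.
# The UNIQUE constraint lets a single row track each page/keyword pair's hit count.
conn = sqlite3.connect("dynkeywords.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS dyn_keywords (
        id      INTEGER PRIMARY KEY AUTOINCREMENT,
        url     TEXT NOT NULL,
        keyword TEXT NOT NULL,
        count   INTEGER NOT NULL DEFAULT 1,
        UNIQUE (url, keyword)
    )
""")
conn.commit()
```

One database file (or one MySQL database, as the post suggests) can serve every site on the server, since the URL column disambiguates the pages.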
Page Structure & Variables To Pass To Your Module Before we get deep into the nitty-gritty functions of the module, we’ll first explore what basic data it requires and how the site’s pages will pass and return that data. In most coded pages, at least on my sites, I usually have the title tag in some sort of variable, typically passed to the template for obvious reasons. The important thing is that it’s there, so we’ll start with that. Let’s say you have a site on home theater equipment and the subpage you’re working on covers LCD televisions. Your title tag may be something like “MyTVDomain.com: LCD Televisions - LCD TVs”.
Side Note/ BTW sorry I realize that may bother some people how in certain cases I’ll put the period outside of the quotes. I realize it’s wrong and the punctuation must always go inside the quotes when ending a sentence. I do it that way so I don’t imply that I put punctuation inside my keywords or title tags etc etc. /Side Note
You know your keywords will be something like LCD Televisions, but you don’t know whether LCD TVs would be a better keyword, ie. it could either be a higher-traffic keyword or a more feasible keyword for that subpage to rank for. You also don’t know whether the plurals would be better or worse for that particular subpage, so you’ll have to keep that in mind while you pass the module the new title variable. So before you declare your title tag, create a quick structure for it (a hashref in Perl, an associative array in PHP). In it you’ll want to put the estimated best keywords for the page: [ Keyword1 => 'LCD Television', Keyword2 => 'LCD TV' ] Then put in the plurals of all your keywords. It’s important not to try to over-automate this because A) you don’t want your script to just tag the end of every word with “s”, for grammatical reasons (skies, pieces, moose, geese), and B) you don’t want your module slowing down all the pages of your site by consulting a dictionary DB on every load. [ Keyword1 => 'LCD Television', Keyword2 => 'LCD TV', Keyword3 => 'LCD Televisions', Keyword4 => 'LCD TVs' ] Now for you “what about this awesome way better than your solution” mutha fuckas that exist in the comment section of every blog, this is where you get your option. You didn’t have to use a hash above; you could have just used a regular array and passed the rest of the data in their own variables, or you could have put them at the beginning of the standard array and assigned the trailing slots to the keywords, OR you could use a multidimensional array. I really don’t give a shit how you manage the technical details. You just need to pass some more variables to the module’s starting function, and I happen to prefer tagging them onto the hash I already have.
[ Keyword1 => 'LCD Television', Keyword2 => 'LCD TV', Keyword3 => 'LCD Televisions', Keyword4 => 'LCD TVs', URL => $url, REFERRER => $referrer, Separator => '-' ] In this case $url will be a string that holds the current URL the user is on. This may vary depending on the structure of the site. For most pages you can just pull the environment variable holding the document URL, or if your site has a more dynamic structure you can grab it plus the query_string. It doesn’t matter; if you’re still reading this long fuckin’ post you’re probably at the point in your coding abilities where you can easily figure this out. Same deal with the referrer. Both of these variables are very important, and inside the module you should check for empty data. You need to know what page the pageview is being made on, and you’ll need to know if the visitor came from a search engine and, if so, what keywords they searched for. The Separator is simply the character you want to separate the keywords with once the title is output. In this example I put a hyphen, so it’ll be “Keyword 1 - Keyword 2 - Keyword 3”. Once you’ve got this, all you have to do is include the module in your code before the template output, have the module return the $title variable, and have your template output that variable in the title tag. Easy peasy, beautiful single line of code.
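To make the hand-off concrete, here is a rough Python equivalent of that structure and the simplest possible title builder (the names page_data and build_title are mine, not from the post; in a real setup the url and referrer come from the request environment):

```python
# Rough Python equivalent of the keyword/URL/referrer structure above.
page_data = {
    "keywords": ["LCD Television", "LCD TV", "LCD Televisions", "LCD TVs"],
    "url": "/lcd-televisions",   # current page URL, normally from the environment
    "referrer": "",              # HTTP referrer, empty on direct visits
    "separator": " - ",          # joins the keywords in the title tag
}

def build_title(data, limit=3):
    # Before any search-engine data exists, just join the candidates in order.
    return data["separator"].join(data["keywords"][:limit])
```

Before the module has any counts to work with, build_title(page_data) simply yields “LCD Television - LCD TV - LCD Televisions”.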
Basic Module Functions Inside the module you can do a wide assortment of things with the data and the SQL, and we’ll get to a few ideas in a bit. For now just grab the data and check the referrer for a search engine using regex. I’ll give you a start on this, but trust it less the older this post gets (each engine gets a referrer pattern plus a capture for the search terms):
Google: ^http://www\.google\.[^/]+/search\? with keyword capture [?&]q=([^&]+)
Yahoo: ^http://(\w*\.)?search\.yahoo\.[^/]+/ with keyword capture [?&]p=([^&]+)
MSN: ^http://search\.(msn\.[^/]+|live\.com)/ with keyword capture [?&]q=([^&]+)
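As one possible take on those checks, sketched in Python (the exact referrer layouts have drifted since this was written, so treat the patterns and parameter names as assumptions to verify against your own logs):

```python
import re
from urllib.parse import urlparse, parse_qs

# Engine name, referrer pattern, and the query parameter carrying the keywords
# (q for Google/MSN, p for Yahoo, per the post).
ENGINES = [
    ("google", re.compile(r"^https?://www\.google\.[^/]+/search\?"), "q"),
    ("yahoo",  re.compile(r"^https?://(\w*\.)?search\.yahoo\.[^/]+/"), "p"),
    ("msn",    re.compile(r"^https?://search\.(msn\.[^/]+|live\.com)/"), "q"),
]

def extract_search_keyword(referrer):
    """Return (engine, keyword) if the referrer looks like a SERP, else None."""
    for engine, pattern, param in ENGINES:
        if pattern.match(referrer):
            query = parse_qs(urlparse(referrer).query)  # decodes + and %xx
            if query.get(param) and query[param][0]:
                return engine, query[param][0].lower()
    return None
```

Lowercasing the keyword keeps “LCD TV” and “lcd tv” from counting as two different rows.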
Once you’ve isolated the search engine and the keywords used to find the subpage, check whether that page/keyword pair exists in the database. If it doesn’t, insert a new row with the page, the keyword, and a count of 1. Then select from the database where the page equals $url, ordered by the highest count. If the count is less than a predefined delimiter (ie. 1 SE referrer) then output the $title tag with the keywords in order (you may want to put a limit on it). For instance, if they all have a count of 1 then output from the first result to the last with the Separator in between. Once you get your first visitor from a SE it’ll rearrange itself automatically. For instance, if LCD TV has a count of 3 and LCD Televisions has a count of 2 and the rest have a count of 1, you can put a limit of 3 on your results and you’ll output a title tag like “LCD TV - LCD Televisions - LCD Television”, LCD Television being simply the next result, not necessarily the best result. If you prefer to put your domain name in your title tag, like “MYTVSITE.COM: LCD TV - LCD Televisions - LCD Television”, you can always create an entry for that in your hash and have your module check for it; if it’s there, put it at the beginning or end or wherever you prefer (another neat customization!).
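Putting the counting and the title selection together, a self-contained sketch (again Python with SQLite; the post leaves the storage layer to you, and the function names here are mine) could look like:

```python
import sqlite3

# In-memory table using the ID, URL, KEYWORD, COUNT layout from the post.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE dyn_keywords (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    url TEXT, keyword TEXT, count INTEGER NOT NULL DEFAULT 1,
    UNIQUE (url, keyword))""")

def record_hit(url, keyword):
    # A new page/keyword pair starts at 1; an existing pair gets its count bumped.
    conn.execute(
        """INSERT INTO dyn_keywords (url, keyword, count) VALUES (?, ?, 1)
           ON CONFLICT (url, keyword) DO UPDATE SET count = count + 1""",
        (url, keyword))
    conn.commit()

def title_for(url, fallback_keywords, separator=" - ", limit=3):
    rows = conn.execute(
        "SELECT keyword FROM dyn_keywords WHERE url = ? ORDER BY count DESC LIMIT ?",
        (url, limit)).fetchall()
    if rows:
        return separator.join(r[0] for r in rows)
    # No search-engine referrers yet: fall back to the hand-picked order.
    return separator.join(fallback_keywords[:limit])
```

The ON CONFLICT upsert requires SQLite 3.24+; MySQL’s ON DUPLICATE KEY UPDATE does the same job.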
Becoming Mr. Fancy Pants Once you have the basics of the script down, you can custom-automate and SEO every aspect of your site. You can apply the same technique you used on your title tag to your heading tags. As an example, you can even create priority headings *wink*. You can go as far as doing dynamic keyword insertion by putting placeholders into your text, such as %keyword%, or even a long nonsense string that’ll never show up in the actual text, such as 557365204c534920772f205468697320546563686e6971756520546f20446f6d696e617465. With that you can create perfect keyword density. If you haven’t read my super old post on manipulating page freshness factors you definitely should, because this module can automate perfect timings on content updates for each page. Once you have it built you can get as advanced and dialed in as you’d like.
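The placeholder swap is just a string replacement at render time; a minimal illustration (the function name and sample copy are mine):

```python
# Replace every occurrence of the placeholder with whichever keyword currently
# leads the count for the page. A long hex-looking sentinel works the same way
# and is even less likely to collide with real copy.
PLACEHOLDER = "%keyword%"

def insert_keyword(text, top_keyword, placeholder=PLACEHOLDER):
    return text.replace(placeholder, top_keyword)

body = "Compare %keyword% prices before you buy a %keyword% online."
rendered = insert_keyword(body, "LCD TV")
```

Since the stored copy keeps the placeholders, the density adjusts itself automatically whenever the leading keyword changes.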
How This Works For Your Benefit Here’s the science behind the technique. It’s all about creating better odds of each of your subpages hitting those perfect keywords with the optimal traffic that page, with its current link building, can accomplish. In all honesty, done manually, your odds are slim to none, and I’ll explain why. A great example of these odds in play is the range in competitiveness and volume by niche. For instance, say you build a site around a homes-for-sale database. You do a bit of keyword research and figure out that “Homes For Sale In California” is an awesome keyword with tons of traffic and low competition, so you optimize all your pages for “Homes For Sale In $state”. Without knowing it you may have just missed out on a big opportunity, because while “Homes For Sale In California” may be a great keyword for that subpage, “New York Homes” may be a better one for another subpage. Or maybe “Homes For Sale In Texas” is too competitive, and “Homes In Texas” has less search volume but is a keyword your subpage is capable of ranking for when it can’t rank for the former. You just missed out on all that easy traffic like a chump. Don’t feel bad; more than likely your competitors did as well.
Another large advantage comes from the tendency of short tail terms to have more search volume than long tail terms. Say you have a page with the keywords “Used Car Lots” and “Used Car”. As your site gets some age and you get more links to it, that page will more likely rank for Used Car Lots sooner than Used Car. By the same token, once it’s ranked for Used Car Lots for awhile and you get more and more links and authority, since Used Car is part of Used Car Lots you’ll become more likely to start ranking for Used Car. And here’s the important part: initially, since you have your first ranking keyword, it will get a lot of counts for that keyword. However, once you start ranking for the even higher volume keyword, even if it’s at a lower rank (eg. you rank #2 for Used Car Lot and only #9 for Used Car), the count will start evening out. Once the better keyword outcounts the lesser one, your page will automatically change to be more optimized for the higher traffic keyword while still being optimized for the lesser one. So while you may drop to #5 or so for Used Car Lot, your page will be better optimized to push up to, say, #7 for Used Car. Which will result in that subpage getting the absolute most traffic it can possibly get at any single time frame in the site’s lifespan. This is a hell of a lot better than making a future guesstimate on how much authority that subpage will have a year down the road and its ability to achieve rankings WHILE you’re building the fucking thing; because even if you’re right and call it perfectly and that page does indeed start to rank for Used Car, in the meantime you missed out on all the potential traffic Used Car Lot could have gotten you. Also keep in mind that by rankings I don’t necessarily always mean the top 10. Sometimes rankings that result in traffic can go as low as the 3rd page, and hell, if that page 3 ranking gives you more traffic than the #1 slot for another keyword, fuck that other keyword!
Go for the gold at all times.
What About Prerankings? See, this is what the delimiter is for! If your page hasn’t achieved any rankings yet, then it isn’t getting any new entry traffic you care about. So the page should be optimized for ALL, or at least 3-6, of your keywords (whatever limit you set). This gives the subpage at least a chance at ranking for any one of the keywords while at the same time giving it the MOST keywords pushing its relevancy up. What I mean by that is: your LCD page hasn’t achieved rankings yet, therefore it isn’t pushing its content towards either TV or Televisions. Since it has both essentially evened out on the page, the page is relevant to both keywords instead of only a single dominant one. So when it links to your Plasma Television subpage it still has the specific keyword Television instead of just TV, thus upping the relevancy of your internal linking. Which brings up the final advanced tip I’ll leave you with.
Use the module to create optimal internal linking. You already have the pages and the keywords; it’s a very easy, short revision. Pass the page text or the navigation to your module. Have it parse for all links. If it finds a link that matches the domain of the current page (useful variable!), have it grab the top keyword by count for that other page and replace the anchor text. Boom! You just got perfectly optimized internal linking that will only get better over time.
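A rough sketch of that rewrite step (Python; the regex-based HTML matching and the top_keyword_for callback are simplifying assumptions, where a real module would query the page/keyword count table the post describes):

```python
import re

def optimize_internal_links(html, domain, top_keyword_for):
    """Swap anchor text on same-domain links for each target page's top keyword."""
    def rewrite(match):
        href = match.group(1)
        if domain in href:
            best = top_keyword_for(href)  # stands in for the DB count lookup
            if best:
                return '<a href="%s">%s</a>' % (href, best)
        return match.group(0)  # external or unknown page: leave untouched
    return re.sub(r'<a href="([^"]+)">([^<]*)</a>', rewrite, html)
```

Run over the navigation block at render time, every internal anchor tracks whatever keyword is currently winning for its target page.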
There ya go, naysayers. Now you can say you’ve learned an SEO technique that’s both pure white hat and, no matter how simply you explain it, very much advanced.
Comments (931)
These comments were imported from the original blog. New comments are closed.
“15 epic minutes on the toilet!” ha ha
(Can’t beat some epic-ness. One day there will be a hollywood action film (with dramatic music etc..) of your trip to the crapper.)
I am just now beginning to appreciate this year what you wrote 2 years ago. I fear that this gem will need a year or so to germinate in my noggin.
I think I’ll run to the store, get some Smarties, and read it all again.
wow, that’s an advanced technique. I hope I can read a lot more now that you’re back
PS: I forgot to do the sum (captcha), and lost my previous comment. That could be done better
Some thoughts/questions:
1. I suspect Google won’t like this and will see it as a bad practice. I think it’s easy to trace.
2. For the best results I take it you’ll have to use an already indexed site. That way you have an idea of what keywords work and should go into the list. => If your pages aren’t found, Google can’t supply extra keywords.
3. Extra nice feature on this: a. if the referrer is Google, store the searched words; b. php/curl the Google search with the keywords and trace what sites come up above your URL; c. if your visitor visited those sites (css/history hack), inspect the keywords for those pages as well and maybe add them to the database.
Great post!
Westworld, for #1, how the hell would Google ever figure it out? Even so, they’d probably LIKE it because it’s giving the people what they want.
#3 is a nice idea.
Eli, a question - do you ever just create a new page to take advantage of keywords your script discovers, instead of modifying keywords from existing pages?
Google can compare cached pages with the current and detect that only a few keywords ever change.
Computer generated writing is nothing new(Markov chains,…). I’m sure that Google knows that we can swap words with Thesaurus and the like. Your technique isn’t very different.
Your intention is White hat, question is will Google see it that way ^_^
Excellent post! I reread the paragraph after the regexps in “Basic Module Functions” several times, even though I have 5 years of programming experience. You might wanna rewrite it (or maybe I’m not awake yet…).
I’m actually replicating keywords in my PPC LPs for Google’s QS, but never thought about optimizing my blogs for that kind of stuff, especially over time. Afk coding !
Another thing I like to do on very large database sites is to try and give each of my subpages a link on the homepage for a while… call it featured shop, or most popular, or something like that. Because you are tracking when the bots are visiting the pages, it’s pretty easy to rotate them once the bot has visited the page a few times. For each URL in my database I have a field for msn, yahoo and googlebot which increments every time the bot hits the page. Once all three hit 5 (or whatever) times I just replace it with another subpage link until that one gets hit a few times. Some people do this by just selecting at random every time the page loads, but I like to make sure every page gets its fair share. It’s a good way to get deep pages indexed.
great post Eli.. nice to see you posting again
What the hell is this WH post on BHS???
I must say I am very upset with you for letting this get published… tut tut
J/K Nice post. I’m not too sure how exactly to pull it off, but I think if I read over it again it would help out in some scenarios.
Thanks again Eli.
Whoa, check out this related news today!
Google Analytics Opens API searchengineland.com/google-analytics-opens-api-17967
Now using Google Analytics API data too, also tied into the dynamic module, we can work even more metrics into this.
I see a MAJOR change coming in the way I do things…
Cheers
So basically, this is about tracking your visitors and AUTOMATICALLY using the data to rank relative terms on the page.
You DON’T want to use analytics on your pages.
The technique can be further enhanced by using TYPOS along with the plurals.
Thanks Eli,
Great post so nice to see advanced white hat being discussed.
The second event of IM was much better than the first one.
your post is great, very good information.
Thanks for wonderful post
this will really help us
Wow! You don’t know what you are talking about! Why guess at the keywords in the first place? This is where 97% of people fail when trying to optimize a site.

Any COST-WORTHY keyword research software will tell you the exact order of the keywords, the demand, the supply, plural ’s’ versus singular, etc.

Why guess? If you have the hard concrete info, you’re gonna go that much further.

And yes you can put the period outside of quotation marks, if you are quoting someone else, a poem, movie title, etc.

I appreciate the time you took to write this, but you may be confusing a lot of newbies and ‘oldies’ alike.

The proper way is to find a software that truly brings back the words so you know if the demand for lcd television is 3000 per month and 2000 supply, or if lcd tv has 2000 demand per month, but only 300 supply. I’d in no way want to guess that, I’d want to KNOW it, so that I can add lcd tv as the primary word, and perhaps lcd television as the secondary, instead.

I’d much rather compete against 300 websites per month than 2000 per month.
Guessing at keywords only wastes a lot of precious time.
And how would this work if the keywords / traffic change over time?
I agree good keyword research is the place to start but this method means you can be lazy (which is good) and optimal over time (also good).
It also has possibilities for expansion none of us lot have even thought of.
“…it’s slightly too much trouble for most people to implement…”
I’m lucky because I make my websites in a way that it’s not hard to implement something like this method.
wow eli!
i’ve never commented before, but this time i have to do it. this is a very clever way to do something that was in my head for a long time.
thanks a lot, i will post my experience if i can get some results.
greetings from spain!
priority headings are where you put the primary keywords as your first heading and secondary keyword as your second heading and work your way down from there with the subheadings.
Primary Keyword
some text blah blah
Third keyword
blah blah more text
Secondary Keyword
blah blah, you understanding?
fourth keyword
blee blop bloop text.

holy pants and shirts… great post.
I thought I was advanced because I dynamically insert my keywords from PPC. im seriously at the kiddie table. this is beautiful.
thanks Eli, awesome, as always…
Nice post Eli, I like ur style nephew.

Just wanted to share something with bluehat lovers:

2 months ago, I launched a new WP blog about cooking news (French language). Visits were about 15-20/day, article post frequency was 2-3/week, PR 2.

1 week ago, I posted an article on Senseo machines’ technical problem (they can explode). The article was 250 words + 1 pic, no keyword optimisation.

The same day, I received 400+ visits for this article, with the following 3 main keywords: defaut senseo, senseo defaut, cafetiere senseo defaut.

I ranked #2 for the keyword “defaut senseo”.

The next day I received 380+ visits, still ranked #2 for “defaut senseo”. Title+H1 were “Problemes techniques Senseo, liste des machines avec le defaut”.

The third day, I decided to update the post with an optimized title+H1. Post updated with title+H1 “Defaut Senseo, liste des machines avec defaut technique”.

Day 3 visits: 120. No ranking at all for “defaut senseo”.

Day 4 visits: 15. No ranking at all.
I tried the following GG search: defaut senseo site:mywebsite.com
The post was here, with the updated title.
Damn.
50+ comments!!! must be a great post..
but, for the first time I failed to understand a technique presented on this blog…
anyways… will try once more to go through it…
was waiting since long for an update from eli… thanx mann
write something new about ‘SEO Web Design’ if you can spare some time… eagerly waitin for new updates…
goodluck
Welcome back Eli, another triumphant return. Sounds like you’re building the “Terminator site”… it never stops, until you’re outranked.
at what point do you expect it to go sentient?
Hi Eli,
this is an interesting post that I think I understand. But I think you are missing out by not posting simple diagrams of how things work.
Like a picture tells a 1000 words kind of thing.
Since some people are more visual in their thinking and not so good at reading.
Like they can visualize $ but may struggle to read the actual word Dollar maybe?
Personally I use Visio that I found on a CD at work many years ago before MS took over that software, but I guess you can use Open Office just as easily to make some flow charts etc.
Just some ideas to add graphics to posts.
Ned
No need to complicate it further.
Actually the idea is very easy:
For each url of your website store the search phrase (keyword), that the visitor came from. Use these keyword(s) in many places - title, h1, backlinks… be creative. That’s all.
The main point is - make it all automated = dynamic.
Hi Neon,
How good is it if I use an H1 tag with keywords & a link at the top of my web page? Will it help my site rank?
That is NOT the point of the article. The automation is not intended to replace the keywords based on what people typed in to find your site.
Rather, the automation is to re-order only the priority given to a hand picked list that you have identified as being ideal keyword targets for your site page. You discard the information for keywords outside that list, but use the matching data for keyword driven visits to adjust the presence of your keyword short list in your core seo asset locations (page title, h1s, etc) for each page.
However, as others have noted, this technique is only as good as your initial research, and really only valuable on larger sites or sites where you don’t have the time to handle it all manually by comparing traffic logs to your initial implementation.
I thought that at first, but after 20 mins I’d got the basics working… Although last night I took down my VPS due to incorrectly nested braces - joy
I suggest just giving it a go!
Eli,
I was doing something quite similar. I store all keywords along with url. Then, I create a webpage for each keyword with a similar content and publish it in a different domain.
Would it be OK if I translate this post into Turkish and put it on a Turkish webmaster forum? Of course, I will cite this page as the source and provide a link.
Thanks,
Surprised to hear you say this “It’s important not to try to over automate this” … I’d also be surprised if Meemo doesn’t have an IP cloaked dickroll next time he shows up here.
Sub-pages on my site tend to be root phrases of the long tails, so “shiny blue widget” and “overpriced blue widget” are on the “blue widget” page with /blue-widget/ in the url. Title, header & portions of the content are switched between all depending on recent SERP visitors.
I say recent visitors because I let tracking expire after 30 days. This is more beneficial than having accumulated traffic over the years stand in the way of an up-and-coming short tail.
I was just thinking about this.
How best is it to maintain this over time? In my current system I just increment counts for keywords; I don’t timestamp them. This means in a year’s time I may have a new keyword appear giving me loads of traffic, but my results are skewed by old data.
Is it best to store each individual search, timestamp it, then only reference recent data (past month). Or is there another way? I am just a little worried about so many inserts into the DB if I do it the timestamp way.
I use date('Y') + a 3-digit date('z')
Then when I run the sql to update the keyword order of each page I only pull visitors if the column ‘year_doy’ is greater than today - 30 days (considering year changes). This ignores old traffic.
The easiest way would be to just record the month and update your keyword order when the month changes … but I like flexibility.
This also allows me to link to “hot pages” on the home page with the appropriate anchor for the big traffic internal pages from the past 7 days. That extra link juice will push it up the SERPs just a smidgen more.
Very good article! I have had thoughts about doing it this way but lack the programming skills.
“Who the fuck searches for a single common word such as Internet? Your audience is who.”
That will be a classic!!!!
And Bulk - very interesting
Hi Eli,
Good to hear from you again with such informative stuff. Gonna read a few time to really grasp the whole thing.
Looking forward more in the future.
This article was so over the top. I didn’t understand most of it. I like to read stuff I kinda understand and then later on often I will get it in a flash of understanding. I think this post will take a double flash for me to get.
Guy
Great post Eli. I have been looking into something a bit like this, and in my research I found a site with a big list (although a bit outdated) of search engines and their search query parameters: science.co.il/analog/SearchQuery.txt
Using that data it will catch more searches, and thus keyword “hotness”, than only Google, MSN and Yahoo. Especially if your site is in something other than English, the search engines that deliver more traffic can be different ones.
Ok so I thought, because you were so amazingly generous sharing this very valuable data with this community, I want to tell you where my mind went when I read this and share my ideas for this system with everyone. Brace yourself, this might be long.

So I liked everything you had to say but I thought I would make my database like so:

ID | URL | Keyword | Count | Date Crawled | Search Engine | Keyword Priority Switch Date | Visitor IP

Those are all columns in a table.

Now I would do everything you suggested by just recording keyword counts, but the added features such as date crawled would allow me to incorporate the automatic system for posting a deep link to unfresh content so that search engines can get to it easier.

I was also concerned about the keywords that get added or used, in case of competitors manipulating my system, so I was thinking I would build a back end that shows me a table of the keywords that occur in the top 5 for each page. I can then review the ones that show up and either demote or remove them if I deem them inappropriate, allowing me to moderate the automation.
I also added the visitors IP address so that I can count only 1 unique visit per every 24 hours to ensure my competitors don’t hit the back button on a porn based keyword and mess up my website.
I would also generate a dynamic sitemap like you suggested for having a optimized internal link architecture with the top keywords per url as the anchor text.
Immediately I thought of incorporating this into a backlink system I have already built, which would help AUTOMATE OFF PAGE SEO!!! I know, sounds cool, but how? I would take the top searched keyword dynamically for each page and use the Yahoo BOSS search API to search for that keyword in Yahoo. I would then take the top 3 results (unless of course one of those is mine) and use the BOSS API again to gather their backlinks. I would then query the page-level PageRank of each of those backlinks, then select a range of them by having a PageRank range (PR 1-6) or something like that, cause PR6+ is usually Digg or something and they don’t respond to link requests. Then I would have an email system that sent out link requests BY KEYWORD ANCHOR TEXT!! to those sites, which would get me a ton of awesome inbound links to my site. Heck, you could even have it automatically create a link page that links out to all those backlinks so that you’re already linking to them before you make your request!
I also thought it would be interesting to track the conversions of each visitor that enters the website, so you could determine not only which keywords had the highest search volume but also which ones were both high volume and converting!
One last thing: I thought about the idea of resetting the count every 30 days so the data would show currently popular keywords. My issue is that then you're not using your old data, which is just silly. Why not take the data you already have, reduce it to a rounded fraction on a 10-point scale, then reinsert those counts into the database every 30 days? That way a newly popular keyword could overtake the current winner easily, but over time the most consistently effective keywords would still surface if one performed well month after month.
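That rescaling idea can be sketched in a few lines (my reading of the "rounded fraction of a 10 point system"; the exact rounding rule is an assumption):

```python
# 30-day "soft reset": instead of zeroing counts, scale every keyword's count
# down to a 0-10 range. Entrenched winners keep a small head start but can be
# overtaken by a newly popular term with only a handful of fresh hits.
def decay_counts(counts, scale_max=10):
    """counts: dict of keyword -> hit count. Returns the rescaled dict."""
    if not counts:
        return {}
    top = max(counts.values())
    if top == 0:
        return dict(counts)
    return {kw: round(c * scale_max / top) for kw, c in counts.items()}

rescaled = decay_counts({"lcd tv": 500, "cheap lcd tv": 250, "1080p tv": 50})
```

After the reset, "lcd tv" leads with 10 instead of 500, so next month's counts compete on a level field while the historical ranking order survives.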
YAY!!! Let's all quit our jobs, this system just made SEOs not very useful anymore, haha.
What does everyone think of these tactics?
Hi all,
Sorry for the noob question:
but if you have a module that is continuously re-writing your page URLs based on the optimal keywords at the time, doesn’t that also continuously break any backlinks to that page?
i.e., if you have a site linking to “yoursite.com/lcd-television” and then the next day your dynamic SEO module changes the URL to “yoursite.com/lcd-tvs” — haven’t you just wrecked your off-page SEO?
Any insight appreciated.
@Justin: Not really.
First, he doesn’t mention rewriting the URL and second, rewriting the URL shouldn’t break the site anyway. Wordpress’ permalinks don’t break if you change them later on, for example.
If you're concerned, one technique is to just put the post/page ID at the beginning of the URL, followed by whatever keywords you want, and have your posts looked up only by ID instead of by permalink. It's much faster that way, too.
e.g. url.com/54-lcd-television, url.com/54-lcd-tv, etc.
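That ID-first scheme might be resolved like this (a sketch; the helper name is made up):

```python
import re

# Look pages up by the leading numeric ID, so the keyword part of the slug can
# change freely without breaking old inbound links.
def page_id_from_slug(slug):
    """'54-lcd-television' and '54-lcd-tv' both resolve to page 54."""
    m = re.match(r"(\d+)(?:-|$)", slug)
    return int(m.group(1)) if m else None

old = page_id_from_slug("54-lcd-television")
new = page_id_from_slug("54-lcd-tv")
# Both slugs map to the same page, so yesterday's backlinks still land correctly.
```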
@Eli This technique is very impressive!
I’m contemplating whether one central database is the way to go, vs having individual (possibly SQLite or CouchDB) DBs.
You don’t want to be changing the page URLs. Like Justin said, if you change the URLs, then links from other websites will be broken. Like zOMBIe said, links within your website will probably work because the CMS (Wordpress, for example) will correct all internal links to that page.
I don’t think Eli was talking about changing URLs. He was talking about changing other content on the page, such as the title (H1 tag), the title meta tag, description meta tag, etc.
Yes, long post but absolutely excellent technique. I would love to give this a try and play around with it.. if only I had more time =[
@Justin - I think he was saying you would rewrite the Htags, not the actual URL. If you do rewrite the URL you would break the links, you’re right.
When will you post next? I've been waiting for so long.
Kindly write something on SEO-friendly web design.
Once again, great post!
This is a great idea … let the engines/visitors do the work for you… great stuff.
And I understood all of it too!
Cheers Eli, Top stuff.
Now, if I could only understand the rest of your posts
Great article. I always thought classical SEO was overrated; without good content there won't be results, no matter what SEO freelancers or companies say.
On the other hand, if you have great content, Google will do a kind of dynamic SEO, adapting to your site. Your visitors will be doing the SEO for you.
Great post!
This is good stuff for me
This is probably the best blog I've ever read. I suck at programming, so most of the time I have to invent "my own way" to use your techniques, but the ideas are so good you can find thousands of ways to apply them with excellent results.
Keep up the Wonderful work!
Anyone kind enough to post a dummy's guide to this "Dynamic SEO Module" thing?
As for the others saying "nice post" and "keep on posting": I don't think you've even finished one paragraph.
Eli,
I think you're cool with the period (.) outside the quotes. I've heard it different ways, but I believe they must always go inside in American English, while in British English they only go inside if the punctuation is actually part of what's being quoted.
Those who don't like it should definitely chillax.
Eli,
This is really good automation! Could you explain in a little more detail how you do this:
“long nonsense string that’ll never get used in the actual text such as 557365204c534920772f205468697320546563686e6971756520546f20446f6d696e617″
Does it mean that you deliver different content to users and to search engines, by removing the nonsense string from the user version with jQuery or something? I'm asking because I can't see at first glance how a string can influence keyword density if it never appears in the page text.
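For what it's worth, my own reading is that the nonsense string is just a server-side template placeholder swapped out before the page is sent, to users and spiders alike, so it never reaches anyone as gibberish and no cloaking is involved. A hypothetical sketch with a made-up token (the real post uses a long hex string):

```python
# The template ships with a placeholder token that will never occur in real
# copy. Before serving, it is replaced with the currently winning keywords,
# which is what actually moves the page's keyword density.
def fill_template(template, top_keywords, token="XKEYWORDSLOTX"):
    """Swap the placeholder for a phrase built from the top keywords."""
    return template.replace(token, ", ".join(top_keywords))

page = fill_template("<title>TVs | XKEYWORDSLOTX</title>",
                     ["cheap lcd tv", "1080p reviews"])
```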
Hi Eli, thanks for the great information. Seems like I'll have to read it a few more times to really implement it.
A different approach to SEO that I don't get to see on other sites!
wow great article, definitely a lot of great content in here for webmasters everywhere to read!
Definitely a new approach to SEO!!! Thank You!
I miss the good old days when you could build it and they would come.
The search for perfect Seo keeps changing. Rick
Wow Eli, this is powerful stuff, but are you holding anything back on us?
Web SEO Company
“see, the problem is you.”
ain’t that the problem with so many things these days? especially in the SEO world.
that line made this post one of my favorites…
and it made my day.
bravo sir. bravo.
Well I for one am just glad the problem is you, not me!
just kidding, haha, you are right, that line is appropriate most of the time.
Would it be OK if I translated this post into Turkish and posted it on a Turkish webmaster forum? Of course, I will cite this page as the source and provide a link.
Thanks,
Dynamic SEO sounds very good but is extremely hard to implement when you have hundreds of websites. We use general rules when optimizing our sites, and although they are basic, they do work.
Follow those and you can't go far wrong.
@Car Hire Johannesburg Airport
Agreed. The rules you stated still work and we should keep them, but we must also improve and try new strategies.
Thanks, I’ll be adding this to my own website once I get done with some other projects I have going.
Where did you learn all of this information anyway?
Google can compare cached pages with the current one and detect that only a few keywords ever change.
Computer-generated writing is nothing new (Markov chains, etc.). I'm sure Google knows we can swap words with a thesaurus and the like. Your technique isn't very different.
Very interesting and in-depth article.
I like the comparison between short- and long-tail keywords: despite the obvious traffic advantage of short-tail terms, long-tail phrases tend to be much easier to rank for, and their conversion ratios tend to be a lot higher.
That was excellent…
Thanks for posting such a great article…
Wonderful post - I’m definitely going to review your previous posts.
Thanks!
Hmm, I didn't read this one before. This is a very wonderful blog.
Keep it up!
Excellent Post,
I think this is a very interesting approach to SEO. It's a very nice article, and I got useful information from it.
Thanks!
This was an outstanding article I’ll say. However…. I did notice that you put the period outside the quotes and I’m most disturbed…
:p ———— Rebekah the conqueror at WebUnlimited
Hello,
Thanks for such useful information on SEO. This will help me in the future.
Thanks Again!
Really great article about white-hat SEO. Top work, and thanks for the good information and for sharing it with us.
Regards,
SRGIT
I always use the title tag with my main keywords, then an H1 tag with the same, and then use the keywords in the copy, which seems to work.
Thanks for the other information.
Hello, nice post.
Thanks for providing such useful SEO information; it's what I was looking for in the article.
Thanks Again!
We like this technique, please keep it up.
Thanks!!
Hello
I liked your "Advanced White Hat SEO Exists Damn It" post. This really is Dynamic SEO for me.
This is essential for me.
Regards Jeff
I love your blog. I will have to go through and bookmark accordingly.
Cheers
I am waiting for your next blog post.
Good post for SEO people.
Abhijit Seo pune
A very informative article. As a fledgling SEO consultant I am always on the lookout for new SEO tactics and practices.
I don't quite get one element of this concept. I understand what the code is meant to do, the database updates, the template-tag replacements, but I'm missing something in the final analysis, and my understanding is this:
I have default keywords that appear when there is no search term. Those keywords are chosen from the database; the most popular term for that URL (that page on my site) is what goes in by default.
That really is the end point of the script: to have the most popular term as the default for the casual surfer landing there and for the spiders indexing the site.
I will check back to see if anyone confirms what I typed or corrects me.
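The fallback described above might be sketched like so (hypothetical names; `counts_by_url` stands in for the keyword database):

```python
# When a visitor arrives with a search phrase in the referrer, show that phrase.
# Otherwise (casual surfers, spiders) fall back to the most popular keyword
# already recorded for the URL, so the page defaults to its proven winner.
def keyword_for_page(url, referrer_keyword, counts_by_url):
    if referrer_keyword:
        return referrer_keyword
    counts = counts_by_url.get(url, {})
    if not counts:
        return None  # nothing recorded yet; keep the hand-written default
    return max(counts, key=counts.get)

db = {"/lcd-television": {"cheap lcd tv": 40, "lcd tv reviews": 12}}
chosen = keyword_for_page("/lcd-television", None, db)
```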
L.Croonquist Modern Web Hosting
Funny, eh?
What’s hilarious is Eli’s comments have been taken over by spammers. Get it together dude…this site used to be great.
Dynamic SEO sounds good, going to give it a try.
Hope this dynamic strategy works.
SEO modules: I heard of them for the first time today, and it's too complicated for me. Why not spend the same time writing articles, which is a sure-shot white hat approach?
P.S. Loved the brainwashing ad. Do you get clicks on it?
Automated SEO sounds like science fiction and a long way away. I will keep checking in, though; this is very interesting.
Dear Sir / Madam
Great information and good posts; wish I had seen this blog before. This wonderful writing helped a lot. Have a nice day.
Thank you
This article is very important; I thank you so much.
Maybe you’re right. I will follow your thoughts on twitter.
Thanks, Wei
Of all the marketing methods available to the serious-minded business owner wishing to build a solid and sustainable web presence, SEO, followed very closely by social media marketing, is definitely the No. 1 strategy to choose.
Thanks for taking the time to write this lengthy article.
Thanks for sharing, it is very useful for the sites, keep it up
Awesome… I’ve been waiting for some good reading material since… Nov 2008!
thanks Eli!
That software site looks so cool. You have a lot of good SEO tips in the article. Thanks! I’ve been looking for how to improve my SEO.
Many thanks for taking the time to discuss this; I feel strongly about it and love learning more on this topic. If possible, as you gain expertise, would you mind updating your website with more facts? It's extremely helpful for me.
This is an excellent blog, and I'm getting advanced SEO techniques from your article. I'm waiting for the next update.
Thanks
Hello
These are amazing ideas! I really like the notion of winning over the competition to the point of shutting them down. What a unique strategy you convey on this site.
Thanks
Hi
I would suggest that it's okay to outsmart our competitors, but shutting them down "politely" is no different from doing it literally, since the main intention is the same. Hence it is unethical business practice, so to speak.
Thanks
Hi
I may have posted a contradicting comment, but that does not mean I dislike your blog. I want to thank you for sharing the wisdom here.
Rod
Nice article. Today I learned the details of SEO.
Thank you.
Just awesome and fabulous! This site is PR5!
Very interesting post. I've heard a lot of rubbish about "advanced white hat" that often ends up being standard techniques, or methods with no evidence.
A great read!
I think your post on SEO is fair, but you need to post a little more on this topic if you want to help more people. SEO and internet marketing change all the time. You have some good points, and it's great to see you lending a hand to people wanting to do it the right way.
Good job Blue hat SEO
I have always hated black hat techniques they have hurt me and others I know severely in the past. It’s great to find an article discussing white hat SEO techniques for a change. Thank you!
great post
This is a very informative article. I was looking for these things and here I found them; I am doing a project and this information is very useful to me.
It is very nice to see your excellent work, and I like your article very much. With your rich knowledge, we can all learn more from your wonderful post.
HP Pavilion Problems
Thanks Eli,
Great post so nice to see advanced white hat being discussed.
That is an interesting post, I have to say. Good job.
Running Free Manila
Congratulations on your article about white hat SEO.
Organik SEO Solutions