Wikipedia Links pt 2 - The Semi-Automated White Hat Way
Upon special request from Robby, I am releasing the second part of my Wikipedia Links series a bit early. This post details a more advanced way to gain links from Wikipedia and is a step up in complexity, so if you're not ready for it, please get ready before attempting it. It also requires some coding knowledge, so don't let that take you by surprise.
The Objective
The objective of this technique is to semi-automate the process of submitting your content links to Wikipedia. There will be two backends: the first is used by your content writers (you, if applicable), and the second handles the actual process of submitting your links. Note that when I use the term content writers, I am referring to you or anyone else who writes the content for your site.
The Process
1) Create a subsection on your domain where your content writers can post their articles. Create this much like the first part of this series specifies. Build a backend for the content writers to post the articles so they don't have to do it manually or through the CMS. It also helps to put in a good WYSIWYG editor for their benefit. The form for the article will need to include the category the article should be placed in (a category editor is a nice addition), the title of the article, the article itself, and 6-10 keyword boxes.
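To make that concrete, here is a rough Perl/CGI sketch of the script that could receive that form. None of the field names (category, title, article, keyword1 through keyword10) come from this post; they are placeholders I made up, so swap in whatever your own form actually uses.

#!/usr/bin/perl
# Rough sketch of the backend that receives the article form.
use strict;
use warnings;
use CGI;

my $q        = CGI->new;
my $category = $q->param('category');   # category the article goes in
my $title    = $q->param('title');      # article title
my $article  = $q->param('article');    # output of the WYSIWYG editor

# Collect the 6-10 keyword boxes, skipping any that were left empty.
my @keywords = grep { defined $_ && length $_ }
               map  { scalar $q->param("keyword$_") } 1 .. 10;

# ...store the draft however your CMS does it, then hand @keywords off to
# the Wikipedia search step described next.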
2) After the content writers have written the article, filled out the form, and clicked the nice big submit button, you will need to code the next page. Have the script pull the en.wikipedia.org search results for each of the specified keywords, then parse and list the results with checkboxes next to them. That way the content writer has the option of checking all the places on Wikipedia they would like the article they just wrote to be submitted to. It wouldn't hurt to build some limits into this portion to keep it simple and keep the spamminess down: limit the keywords they can use to describe their article and limit the number of results returned from each search.
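Here is one way that search step might look. I'm using the MediaWiki search API here purely for convenience; the post only says to pull the en.wikipedia search, so scraping the regular search results page with LWP works just as well. The srlimit cap and the checkbox field name are my own choices, not anything from the original method.

# Sketch of step 2: search Wikipedia for each keyword and build the checkbox list.
use strict;
use warnings;
use LWP::UserAgent;
use JSON qw(decode_json);
use URI::Escape qw(uri_escape);

my $ua       = LWP::UserAgent->new(agent => 'ArticleSubmitter/0.1');
my $limit    = 5;                 # cap results per keyword to keep it sane
my @keywords = @ARGV;             # in practice, the keyword boxes from the form

for my $kw (@keywords) {
    my $url = 'http://en.wikipedia.org/w/api.php?action=query&list=search'
            . '&format=json&srlimit=' . $limit
            . '&srsearch=' . uri_escape($kw);
    my $resp = $ua->get($url);
    next unless $resp->is_success;

    my $data = decode_json($resp->decoded_content);
    for my $hit (@{ $data->{query}{search} || [] }) {
        my $title = $hit->{title};
        # One checkbox per result; the content writer ticks the Wikipedia
        # articles their new article should be submitted to.
        printf qq{<input type="checkbox" name="wiki" value="%s"> %s<br>\n},
               $title, $title;
    }
}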
3) After they have selected the Wikis they want to link to the article, they press the final submit button. This publishes the article on your website and then uses LWP to fetch the wiki edit page, which will have a URL formatted something like http://en.wikipedia.org/w/index.php?title=Title_Of_Wiki_Article&action=edit
The easiest way is to just grab the title of the article, remove any redundant phrases like “Wikipedia, the free encyclopedia”, and replace spaces with underscores. Pull the article source, regex for ==External links==, and below that heading insert something like the example line below (a rough code sketch of this whole step follows it):
* [http://www.mydomain.com/pathtoarticle Article Title] (you could also put in a short, roughly 100 character description here if you wanted).
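Putting that together, here is a rough sketch of the fetch-and-splice part. The wpTextbox1 name is, as far as I know, the textarea MediaWiki uses on its edit pages, but verify it against the live page source before trusting this; the link line itself is just the example from above.

# Sketch of step 3: build the edit URL, pull the current wikitext out of the
# edit page, and splice our link in under ==External links==.
use strict;
use warnings;
use LWP::UserAgent;
use HTML::Entities qw(decode_entities);

my $ua = LWP::UserAgent->new(agent => 'ArticleSubmitter/0.1');

my $wiki_title = 'Title Of Wiki Article';                 # the result the writer ticked
$wiki_title =~ s/\s*-\s*Wikipedia, the free encyclopedia\s*$//;  # strip the suffix if present
$wiki_title =~ s/ /_/g;                                   # spaces -> underscores

my $edit_url = "http://en.wikipedia.org/w/index.php?title=$wiki_title&action=edit";
my $resp     = $ua->get($edit_url);
die "couldn't fetch edit page\n" unless $resp->is_success;

# Grab the article source out of the edit textarea (field name assumed, see above).
my ($wikitext) = $resp->decoded_content
    =~ m{<textarea[^>]*name=["']?wpTextbox1["']?[^>]*>(.*?)</textarea>}s;
die "no wikitext found\n" unless defined $wikitext;
$wikitext = decode_entities($wikitext);                   # textarea content is HTML-escaped

# The link line from the example above (title and path come from the writer's form).
my $link = '* [http://www.mydomain.com/pathtoarticle Article Title]';

# Insert it directly below the ==External links== heading.
$wikitext =~ s/(==\s*External links\s*==[^\n]*\n)/$1$link\n/;

# $wikitext now holds the modified source, ready to be posted back in step 4.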
4) Then you just have to post the form and double check the article for the change. Always double check, because if your script messes up the Wiki article, that means big trouble. So definitely test before you play, and every time you post, save a copy of the original article so that if your script makes a big mistake you can quickly and easily put the old version back and find where your mistake was. Also be sure to check for a few human errors that may mess up the posting to the Wiki. For instance, make sure they can't put *'s or ]'s in the title, or HTML for that matter; that'll cause some major problems with the Wikipedia insert.
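For the posting itself, something like the sketch below would do it. HTML::Form is handy here because it carries Wikipedia's hidden edit-token fields along for you. The wpTextbox1, wpSummary, and wpSave names are what MediaWiki's edit form uses as far as I can tell, so check them against the live page, and the character blacklist is just one example of the sanity checks mentioned above.

# Sketch of step 4: sanity-check the writer's input, then post the edit form back.
use strict;
use warnings;
use LWP::UserAgent;
use HTML::Form;

# Reject titles that would break the wiki markup we insert.
my $article_title = 'Article Title';                     # from the writer's form
die "bad characters in article title\n" if $article_title =~ /[*\[\]<>]/;

my $ua       = LWP::UserAgent->new(agent => 'ArticleSubmitter/0.1');
my $edit_url = 'http://en.wikipedia.org/w/index.php?title=Title_Of_Wiki_Article&action=edit';
my $resp     = $ua->get($edit_url);
die "couldn't fetch edit page\n" unless $resp->is_success;

# Find the edit form, keep a copy of the original source, swap in ours.
my ($form) = grep { $_->find_input('wpTextbox1') } HTML::Form->parse($resp);
die "edit form not found\n" unless $form;

my $original = $form->value('wpTextbox1');               # save this! it is your rollback copy
my $modified = $original;                                # ...apply the ==External links== insert from step 3 here
$form->value('wpTextbox1' => $modified);
$form->value('wpSummary'  => 'Added external link');     # edit summary (field name assumed)

my $save = $ua->request($form->click('wpSave'));         # press the Save button

# A successful save usually comes back as a redirect to the article page.
# Either way: re-fetch the article, confirm only your one line changed, and
# if anything looks wrong, post $original straight back.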
Well, there you go. I know this is rough, but it's the best I can do short of posting the actual code (my answer is no for those about to ask). Just use it wisely and test, test, test before you actually use it in mass. This is a semi-automated method and is very fast and easy. If you are an idiot and amateurish about it, you'll ruin it for EVERYONE very quickly. All in all, this is a great way to automate the aspects of your promotion work and the work of the content writers for your site. Over time, once the content writers get the hang of it, it creates a great way to get a ton of high PR links. Once created and in practice, it will shock you how fast the empire builds using this method. Just remember what I said: be responsible with it; not every article you write should be submitted to every search result. Being discreet is the key to success here.
Comments (95)
These comments were imported from the original blog. New comments are closed.
Oooh. I just wet myself in excitement
Andy
Ok, you have to define spam here. If the articles are relevant and well written, then just because you're using it for marketing techniques doesn't make it spam.
That's the key. Those articles MUST be relevant and fit the Wikipedia article!
Interesting Idea!
Regards!
pua tips, yeah, PHP is a simple language.
Rett, there are some kind of immediate
Austin, no they won’t
great idea.
nice white hat method. keep posting!
I think these days the Wikipedia editors are more concerned with removing links to commercial content, or if it simply looks like you are promoting your own site. I also think that they probably get a list of the most recent edits to “approve”, rather than just randomly patrolling through stuff. I’ve had some stick and some get removed. Here’s a few tips for getting wiki links to stick.
Don’t post too many from the same wiki account. If all you’re doing is posting links, they’ll just wipe them out.
Don’t put your ads on your pages until after it seems likely that your wiki links have been moderated. I’m just guessing on this btw, but from experience, once your link has been up for a while, it’ll stay approved.
Your pages have to be good content. If you're not capable of creating stuff that looks like reference material for whatever niche you're in, this might not be the best method for you.
Eli, I remember you said in other posts that one really good link can give you a really good boost. I've noticed one good wiki link "seemed" to give certain pages a Google boost, making me question whether all their talk about nofollow might be smoke and mirrors after all... who knows though, any more thoughts on this?
Even if they are no follow, you still get consistent, highly targeted traffic from wiki pages. The whole thing ultimately is a testament to Google’s bullshit… as you said a while ago, a shithead subpage on an old domain should not be worth more than an entire site dedicated to a subject, but that’s the only way google can cover for the fact that it can’t actually detect “quality” - so they have to measure age, popularity, stuff like that. The entree is still boiled dog.
keep it up
thanx
thanks man
it’s very good article