
The Automation Mindset

(Note: I know I haven’t posted in a while, and trust me, you have all let me know. I apologize, and hope this post somewhat makes up for it. I even included a diagram! Be happy! :D)

Often when I talk to people about what I do, I can’t really sum it up with the term “blackhat SEO” or anything of the like; it usually ends up being automation and scalability. First, though, I want to say this: automation is not simply a field or a set of tasks, it is a mindset. Once you start thinking the right way about automation, your mind hardly ever stops as you break down every little step behind a given task and make it all powered by scripts and programs. One of the concepts I’ve been thinking about lately for new scripts was niche research, but why stop there? Originally it was very simple:

  • Check root keyword search volume (organic).
  • Check average CPC on AdWords.
  • Report to me.

The whole goal of automation is to take as much human interaction out of the process as possible, and the goal of scaling is to be able to replicate it many times over. So let’s take this one step further and look at automating almost the entire process.

  • Check root keyword search volume (organic).
  • Check average CPC on AdWords (if over a certain number, continue).
  • Check exact-match domain names.
  • Find an exact match that’s open on any TLD worth using.
  • Register it via the eNom API, or with a cURL script against NameCheap.
  • Point DNS to my servers using the API.
  • Create a cPanel account via WHM on the server so the domain resolves and serves.
  • Upload YACG over FTP and run its initial installation, placing AdSense on the pages for initial monetization.
  • Install an in-text ad script for secondary monetization on terms in the content itself (Kontera comes to mind).
  • Ping out the new site and submit it to a few aggregators.
  • Drop 100-200 backlinks (including deeplinks) to the site.
  • Social bookmark a few of the pages for initial indexing (AutoPligg comes to mind).
  • Report to me.
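The research half of that pipeline can be sketched in a few lines of Ruby. This is a minimal skeleton with the registration, DNS, and upload steps left out; the CPC threshold, TLD list, and the hardcoded `research` hash are illustrative assumptions (in practice those numbers come from scraping, not a literal):

```ruby
# Sketch: filter researched keywords by average CPC, then build
# exact-match domain candidates across a few TLDs worth using.
TLDS    = %w[com net org].freeze
MIN_CPC = 1.50 # assumed threshold: only continue on keywords worth monetizing

# Turn a keyword phrase into exact-match domain candidates.
def exact_match_candidates(keyword)
  slug = keyword.downcase.gsub(/[^a-z0-9\s]/, '').gsub(/\s+/, '')
  TLDS.map { |tld| "#{slug}.#{tld}" }
end

# keyword => [organic search volume, average CPC] (example data)
research = {
  'dog training tips' => [12_000, 2.10],
  'free ringtones'    => [90_000, 0.40],
}

# Keep only keywords over the CPC threshold, then expand to domains
# that would be checked for availability and passed to registration.
worth_building = research.select { |_, (_vol, cpc)| cpc >= MIN_CPC }
candidates     = worth_building.keys.flat_map { |kw| exact_match_candidates(kw) }
```

From here each candidate would feed the availability check and, if open, the registration call.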

See the huge difference when you go with the automation mindset and how much is really possible? In total we’re talking about a $2 cost per 500 keywords researched (assuming DeCaptcher’s rate of 1,000 captchas for $2). This could easily run $100 a day in domain costs, but the sheer numbers alone would probably start making it back via monetization within a week or two. Now, will this always work? No. Will it work for now? Probably.
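Roughing out that math: the two-solves-per-keyword figure below is my assumption for why 500 keywords maps onto 1,000 captchas (one solve for the volume check, one for the CPC check):

```ruby
# Back-of-the-envelope cost of the captcha-solving step.
COST_PER_CAPTCHA   = 2.0 / 1000 # DeCaptcher: 1,000 solves for $2
SOLVES_PER_KEYWORD = 2          # assumed: one for volume, one for CPC

def captcha_cost(keywords)
  keywords * SOLVES_PER_KEYWORD * COST_PER_CAPTCHA
end

captcha_cost(500) # => $2.00 for a 500-keyword batch
```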

Back to the original idea behind this post, though: automation is a mindset. Once you really get on a roll you’ll start thinking about how to automate damn near everything. A comparable way to think about what I mean is a dishwasher. Someone figured out how to make a device that washes dishes. That’s amazing, and it helps everyone on a daily basis, but my mind goes straight to figuring out some way to get all of the dishes in the house into the dishwasher, so I can literally leave them anywhere and have them picked up. Robotics FTW. The bottom line is that anything can be automated further and better; the question is how, and how much.

Let’s go with another example, though. I drew up a diagram for a WickedFire thread posted in the Traffic & Content section over there about an SEO network. Here’s the diagram, and then I’ll explain how I automated it:

Now comes the somewhat impressive part: I have this entire system automated except for the social bookmarking. Sadly I still have to type about 5 keys and click 2-3 times before the social bookmarking works how I want it to. The WPMUs are all auto-generated and use a WordPress backup file to load about 150,000 markov’d posts, delay-posted over 3-6 months (think DataPresser, but a little less clean on the output). The micro URLs are added via the blogroll so they expand out with all of the new posts, and then a few of the posts are bookmarked for fast initial indexing, plus incremental indexing as the network grows. The micros themselves are basically article rewrites, only about 5-10 pages per site. They are somewhat of a way to “link launder” your links to your main site, so you can clean them up from completely spammy and automated to clean and respectable. If done right, over time the micros will end up around PR4-ish with a good number of backlinks, including deeplinks.
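The “markov’d posts” step is just a word-level Markov chain run over source text. Here’s a minimal order-1 version to show the idea (DataPresser and real spinners are obviously more polished; the sample text and seed are just examples):

```ruby
# Minimal order-1 word Markov chain: learn word -> possible-next-words,
# then walk the table to spin a "new" post from source material.
def build_chain(text)
  chain = Hash.new { |h, k| h[k] = [] }
  text.split.each_cons(2) { |a, b| chain[a] << b }
  chain
end

# Seeded RNG so runs are repeatable while testing.
def markov(chain, start, length, rng = Random.new(42))
  out = [start]
  (length - 1).times do
    nexts = chain[out.last]
    break if nexts.empty? # dead end: the last word never had a successor
    out << nexts[rng.rand(nexts.size)]
  end
  out.join(' ')
end

source = 'the dog chased the cat and the cat chased the dog'
chain  = build_chain(source)
spun   = markov(chain, 'the', 8)
```

Feed it a few scraped articles instead of one sentence and you get the kind of filler content the WPMUs are loaded with.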

The whole point behind that example is that everything under “Money Site” is automated, and can be scaled out almost infinitely. That’s actually all I have right now on this subject, but please, for the love of God, let me know what else you all want me to write about. I’ve been considering posting some scraping classes that I have, but I’m still not sure whether I should post them in Ruby or PHP, etc.

As always, hit me up on Contempt.me (Skype username) with some feedback.

5 years ago in Blackhat Explained, Niches, Search Engine Marketing
About the Author

My name is Rob Adler and I'm an algo-holic. I spend most of my time coding, data mining, spidering and consulting for SEO. I hope the posts here are beneficial for you, and hopefully I can blow your mind every now and again.

15 Comments to The Automation Mindset
    • SniperRyan
    • Would love a good tutorial on using hpricot to scrape a big site. I’ll even give you the idea for the project I’m working on to get you started.

Picked up a dropped domain with good PR and a head start of several thousand backlinks, which means I could get a big site indexed right away.

The topic is books of a certain variety (like “American classics” or something), so I’d like to scrape the Amazon DB for books of that type into my DB, turn the links into aff links, display some EPN links for used books, and update it when new books are found (maybe a weekly cron that checks the latest feed results against dupes?).

      So, for your “automated mindset” perspective:

      * Get list of titles in a broad topic: american classics, classic literature, asian literature, etc.
      * Scrape titles, description, and Amazon aff link into DB (using Ruby).
      * Build site structure around the data (maybe using tags from Amazon?)
      * Check for updates and import them.
      * Search ebay for titles and display used options.
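For the aff-link step in the list above, the rewrite is just ASIN-to-URL templating before the record hits the DB. A tiny sketch; the `mytag-20` Associates tag and the ASINs are placeholders:

```ruby
# Turn scraped ASINs into Amazon affiliate links for storage in the DB.
AFF_TAG = 'mytag-20' # placeholder Amazon Associates tracking tag

def aff_link(asin)
  "http://www.amazon.com/dp/#{asin}?tag=#{AFF_TAG}"
end

# Example ASINs standing in for the scraped title list.
links = %w[0743273567 0451524934].map { |asin| aff_link(asin) }
```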

      A couple more thoughts to make it more powerful:

      * Autogen a mini-site for every title and have it link back to the page of the main site (maybe scrape a review and markov it 10x?).
      * Identify similar titles based on tags and interlink them.
      * Identify blogs talking about the book and trackback them (similar to pingCrawl).

      If you want to make it bigger than my project, how about combining it with scripts in the WF war chest? Start with “literature” and see what G suggests. Then build a site on each genre, with the same structure as above.

      You could do subdomains or search for domains as you suggested in this post.

      Need to work on my PHP (or ruby) skillz.

    • Victory
    • PHP > Rails when replication is a major part of your operation, so I would suggest putting out your scraper class in PHP.

      Also, you can’t talk about automation without talking about modularity, so a post about building and using modular tools would be interesting.

  1. Pingback: Automation, More Addicting Than Crack | Contempt

    • Shane
    • Hi –

      Spent quite a while going through your blog today; very good info. I have a question or two, though, pertaining to your diagram.

      What are the ‘micros’? Are those sites where you own all 5 domains and have WP on them, or some other relevant content, with the objective of driving traffic to your money site?

      I’m assuming the links from the micros to the money sites are actual affiliate links leading to the publisher’s page?

      Sorry for the questions in comments, just trying to connect a few dots.

    • James
    • Awesome post as always, mate. What script are you using to get the social bookmarking done? You did mention AutoPligg, but your diagram has other social bookmarking sites in it, so I had to ask.
