
The Dark Art of Inflating Traffic


    Web Traffic Robot

    Consider the traffic robot scenario. 

    The webmaster decides how much extra traffic they want each month and then defines the path the robot will take through the site.  Since bot traffic is actually rather expensive in terms of server and bandwidth usage, the activity is spread out over the entire month to better blend in with normal website traffic.  Googlebot does the same thing, pacing its crawl to keep the website speedy while still getting the information it needs.

    So say you want an extra 60k unique visitors a month; the robot will spawn enough instances to make sure the quota is filled.  To keep things simple, our example will use only one robot instance.
    A look at the numbers
    60,000 unique visitors / 30 days = 2,000 visitors per day
    2,000 visitors × 6 pages per visitor = 12,000 pageviews per day
    12,000 pageviews / 24 hours = 500 pageviews per hour
    500 pageviews per hour, interleaved across visits, works out to one page roughly every 8 minutes per visit
    8 minutes per page × 6 pages per session = 48 minutes time on site

    Average that 48 minutes against, say, the 2-minute session of a normal visitor and the site-wide average comes out to roughly 25 minutes.  Of course, these figures are arbitrary and assume the robot is running 24 hours a day.
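    As a rough sketch of how such a robot might pace itself, here is the arithmetic above turned into code.  Everything in it is hypothetical: the example.com URLs, the quota, and the run_robot helper illustrate the scenario, not any real tool.

    ```python
    import random
    import time
    import requests  # assumes the third-party requests library is installed

    MONTHLY_VISITORS = 60_000   # the extra uniques the webmaster wants
    DAYS = 30
    PAGES_PER_VISIT = 6

    # The arithmetic from the example, in code.
    visits_per_day = MONTHLY_VISITORS // DAYS             # 2,000
    pageviews_per_day = visits_per_day * PAGES_PER_VISIT  # 12,000
    pageviews_per_hour = pageviews_per_day // 24          # 500
    pause = 3600 / pageviews_per_hour                     # 7.2 s between requests

    # The "path" the robot walks through the site (hypothetical URLs).
    PATH = ["https://example.com/",
            "https://example.com/articles/",
            "https://example.com/reviews/"]

    def run_robot():
        """Issue pageviews at a steady trickle so the traffic blends in."""
        while True:
            requests.get(random.choice(PATH), timeout=10)
            time.sleep(pause)
    ```

    Note that a single instance making all 500 hourly pageviews back to back sits only about 7 seconds between requests; a real bot would interleave many simulated visits so each one looks like a leisurely 8-minutes-per-page session.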
    Things to Consider
    Several factors play into this scenario and can alter the numbers.  The biggest is community activity, such as an extremely busy web forum; in the case of website A, there isn't one.

    The next factor to consider is the meta refresh tag.  Meta refresh (<meta http-equiv="refresh" content="600">) is an HTML tag that tells the browser to reload the page every 600 seconds.  On the surface that doesn't seem like a huge deal, but if you are a slow reader the page can refresh on you, which is distracting if not annoying.  Most sites use this tag to refresh banners, which also gets them a free pageview.

    If you happen to leave the site open and walk away for an hour, by the time you come back the page will have refreshed six times and increased your time on site by 60 minutes.
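    To make that arithmetic concrete, here is a quick sketch using the figures from the example above (the interval and idle time are just those figures, nothing measured):

    ```python
    # Illustrative arithmetic: how a meta refresh inflates measured time on
    # site while the reader is away from the keyboard.
    refresh_seconds = 600              # <meta http-equiv="refresh" content="600">
    idle_seconds = 60 * 60             # reader walks away for an hour

    extra_pageviews = idle_seconds // refresh_seconds       # 6 free pageviews
    extra_minutes = extra_pageviews * refresh_seconds / 60  # 60 extra minutes
    print(extra_pageviews, extra_minutes)                   # -> 6 60.0
    ```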

    Another factor to consider is a Flash game or a movie running on the site.  Flash is a multimedia file that plays without loading new pages, while average time on site is calculated from pageviews that land within the session timeout, so an hour spent in a game generates no pageviews and most of that time never makes it into the numbers.
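    To see why, here is a rough sketch of how a typical analytics package turns pageview timestamps into time on site, assuming a common 30-minute session timeout; the timestamps are made up and stand in for a visitor who plays a game for about an hour between pageviews.

    ```python
    from datetime import datetime, timedelta

    # Hypothetical pageviews: two pages, then roughly an hour in a Flash
    # game with no pageviews, then one more page.
    hits = [datetime(2012, 1, 1, 12, 0),
            datetime(2012, 1, 1, 12, 5),
            datetime(2012, 1, 1, 13, 10)]

    TIMEOUT = timedelta(minutes=30)   # a common session-timeout default

    sessions, start, last = [], hits[0], hits[0]
    for hit in hits[1:]:
        if hit - last > TIMEOUT:      # gap too long: close the session
            sessions.append(last - start)
            start = hit
        last = hit
    sessions.append(last - start)

    print(sessions)  # five minutes plus zero -- the hour in the game never counts
    ```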

    Finally, the last factor we'll mention is a site monitoring application.  These applications are designed to access a website and report back if the site is down or throwing errors.  One would look like a traffic bot, but it almost never changes its IP address, so it would not alter the numbers much.  Not to mention, site monitoring applications don't execute JavaScript, so script-based analytics won't record them at all.