Filtering Out Google Analytics Junk to Read Your Numbers Better

Posted By Guest Blogger 16th of January 2015 · General

This is a guest contribution from Larry Alton.

Web developers, content managers, marketing teams, and many other online professionals rely on Google Analytics to understand visitor trends. However, you can run into a significant amount of noise, which can skew your Google Analytics numbers and your subsequent interpretations of this data.

Luckily, you can filter out certain types of traffic so that your numbers don’t get watered down by your own visits or by web crawlers, or duplicated by letter-case discrepancies in web addresses. Here are three main filters to consider setting up as you move forward with a Google Analytics strategy.

Cutting Out Internal Traffic

Every time you and your colleagues navigate through your own website, you add hits that skew your traffic numbers. Luckily, you can filter these out of your Google Analytics reports so that you get a more accurate representation of your real traffic.

Just head over to your Admin page and select “Filters” under the “View” column. Next, click on “+New Filter” and make sure that the “Create New Filter” bubble is selected.

Name your filter something like “Exclude office traffic” or “Exclude home traffic.” Choose the “Custom Filter” option, then select “IP address” from the dropdown menus.

When you enter the IP address in the Filter Pattern field, you’ll need to put a backslash before each dot, in line with Google’s regular expression requirements.
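Because GA filter patterns are treated as regular expressions, an unescaped dot matches any character at all. A minimal Python sketch (using a documentation IP address, not a real office address) shows why the backslashes matter:

```python
import re

# Example IP from the documentation range, not a real office address.
office_ip = "203.0.113.42"

# re.escape produces the backslash-escaped pattern a regex engine expects:
pattern = re.escape(office_ip)
print(pattern)  # 203\.0\.113\.42

# The escaped pattern matches only the literal address...
assert re.fullmatch(pattern, "203.0.113.42")

# ...while the unescaped one also matches lookalike strings, because
# each "." matches any single character:
assert re.fullmatch(office_ip, "203a0b113c42")
```

The same escaped form (`203\.0\.113\.42`) is what you would paste into the GA Filter Pattern field.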

Excluding Bots and Spiders

It can be extremely frustrating to examine your web traffic data, only to see that certain recurring bots and spiders account for a large chunk of the pie. Luckily, Google is taking proactive measures to protect Analytics users from these annoyances.

You can opt into Google’s automated bot and spider filtering by going to your Admin panel, clicking on “Reporting View Settings” and checking the box that reads, “Exclude all hits from known bots and spiders.” However, some bots and spiders will still slip through. You can target these individually by creating a new filter, selecting “Custom” and then choosing “Visitor ISP Organization.” Then enter the bot’s service provider as a regular expression.
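As a rough illustration of the same idea outside the GA interface, here is a hypothetical Python sketch that flags visits by matching ISP organization names against a regular expression. The organization names in the pattern are made-up examples, not a vetted bot list:

```python
import re

# Hypothetical pattern covering a few hosting providers that crawlers
# commonly report as their ISP organization; illustrative only.
bot_isp_pattern = re.compile(r"(amazon|google|microsoft)", re.IGNORECASE)

def looks_like_bot(isp_organization: str) -> bool:
    """Return True when the visitor's ISP organization matches the pattern."""
    return bot_isp_pattern.search(isp_organization) is not None

print(looks_like_bot("Amazon Technologies Inc."))  # True
print(looks_like_bot("Comcast Cable"))             # False
```

In GA itself you would enter the equivalent pattern (e.g. `amazon|google|microsoft`) in the filter’s pattern field rather than running any code.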

Keep an eye on your analytics, and be sure to create manual filters for additional bots that attempt to sneak past you. This can prevent bothersome bots and spiders from skewing your website’s data.

Enforcing Lowercase

If visitors enter a URL into their browser, or click links, that use a mix of uppercase and lowercase characters, then you could wind up with duplicate Google Analytics entries for the same destination. Luckily, you can fix this issue by creating a filter.

Just create a brand new filter and call it something like “Force Lowercase.” Choose “Custom,” click on the “Lowercase” bubble, and select “Request URI.” Once this is done, you should stop seeing multiple entries when browsers load up a page using different letter cases.
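What the Lowercase filter does can be sketched in a few lines of Python: normalizing the request URI before counting collapses the case variants into a single entry (the paths below are illustrative):

```python
from collections import Counter

# Without case normalization, the same page shows up as separate rows.
raw_hits = ["/About", "/about", "/ABOUT", "/contact"]

# Lowercasing the request URI, as the GA "Lowercase" filter does,
# merges the variants into one entry per page.
merged = Counter(path.lower() for path in raw_hits)
print(merged)  # Counter({'/about': 3, '/contact': 1})
```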

Increase the accuracy of your Google Analytics traffic data by using filters to cut through the noise. Don’t allow your metrics to become skewed by your own internal traffic, spiders and bots, or by web addresses that contain a mixture of letter cases.

Larry Alton is an independent business consultant specializing in social media trends, business, and entrepreneurship. Follow him on Twitter and LinkedIn.

 

About Guest Blogger
This post was written by a guest contributor. Please see their details in the post above. If you'd like to guest post for ProBlogger check out our Write for ProBlogger page for details about how YOU can share your tips with our community.
Comments
  1. DrewryNewsNetwork really doesn’t rely on Google Analytics to deliver accurate numbers. Though the service is great to use and does serve great analytical data, there’s nothing that tops having real relationships with people on social networks like Pinterest, Twitter, Facebook, and LinkedIn, because you’re getting real human interaction and feedback on your site content.

  2. Thank you! I didn’t know it was possible to exclude Bots & Spiders from Google Analytics.

    It took a little digging to find it.
    Here is how I found it.

    1. Click on Admin in the header bar
    You should see 3 columns Account | Property | View
    2. Under the View column you’ll see the option View Settings – click on it
    3. Find the check box “Bot Filtering” under the Currency option

    Cheers!
    Rob

    • Rob,

      I never knew there was an option inside the Google Analytics dashboard to filter out bots. This is definitely a good thing, because bots can inflate traffic numbers when the graph shows how many people visited from search engines and social networks. When did you find out this valuable information?

    • Thanks Rob. I was a bit lost at first about where to find that. Thanks again.

  3. Thanks Larry for the post.

    Bots and crawlers have driven me nuts forever, and I’m always trying to stay on top of how to get the most accurate data that I can.

    Usually most articles are the same, but this is the first time I’ve seen the uppercase/lowercase problem mentioned. Now I’ve got to go through all of my sites and change everything… but it will be worth it!

    Thanks again! :)

  4. Thank you! I just started using Google Analytics for my blog. I feel it still records my views even when I check for it not to track me. Will definitely try these ideas!

  5. Another huge source of analytics spam has been referral traffic from sites like semalt, buttons-for-website and 7makemoneyonline. For a newer blog with a low readership, referrals from these sites can comprise 50% or more of daily traffic.

    I’m not sure if this is the best way but I removed these by going to Admin, Property, Tracking Info and then Referral Exclusion List and added these domains to remove them from my analytics.

  6. I don’t use Google Analytics often, but the ‘bot filtering’ option will be very helpful later on when I need it.

  7. Hi
    What if it is not a known bot? For example, sites like ahref and moz also pull your data. Is there a way to filter out these bots manually?

    Regards
    Neil

  8. Larry, many thanks for sharing this! Really useful tips for getting accurate information on website visitors. This takes some work, but it’s very useful in the longer run and keeps us more realistic about our efforts and objectives.

  9. Hey Larry,

    Thanks for such an informative post full of knowledge and awesomeness. I love the way you write and your dedication towards the topic. And Darren you are doing great job. Thanks for this.

    Have a nice weekend. Hoping for more great stuff like this.

  10. Hi Larry, I am currently getting lots of spammy traffic from hulfingtonpost.com/referral, forum.topic51337555.darodar.com/referral and priceg.com/referral, and it is really creating junk in my Google Analytics. Isn’t there any way to prevent such unwanted spammy traffic?

  11. Thanks so much for the advice, this makes it so much easier to navigate. Appreciate the advice very much.

  12. It’s good to see that your blog’s recent posts are about Google analytics. I am learning more and more.

  13. Nice tips. I’ve been reading your articles for the last 3 days; very informative for a noob like me. Thanks!

Comments are closed for this post.