How To Clean Out Your Website Analytics
April 16th 2018
When it comes to making important decisions about online marketing and search engine optimization, collecting and reviewing data is one of the biggest jobs you need to take care of. At SilverServers, we’ve integrated our Content Management System and reporting tools directly into Google Analytics so that all of our clients have a detailed window into the performance of their website. Google Analytics is a great place to review the effects of your website’s search engine optimization and marketing channels. As long as the stats you’re looking at are as clean as possible, you can make some pretty powerful, educated decisions about where to spend your online advertising money and time.
The problem with Google Analytics and other stat-counting software is the accuracy of your numbers. Bots spider their way around the web, visiting thousands of pages every second. In the past five years, bot traffic has eclipsed real human traffic every year except 2015, and in some recent years bots hit websites at roughly a 2-to-1 ratio to real people. That’s an alarming thing to consider if you’re using unfiltered stats to make decisions about your website.
Even within the bot traffic itself, the ‘bad’ bots make up most of the visits. Many websites and platforms use helpful bots that gather information, monitor uptime, or shuttle data between the web and phone apps, for example. The large majority, though, are nefarious in nature and offer no benefit to your site or to the web at large.
How To Recognize Bot Traffic
Regardless of whether these bots are good or bad, they don’t represent a real, marketable visit to your website. Both sides of the robot battle need to be removed from your Analytics if you want any real chance at understanding your users. Signs that bots are affecting your stats include:
- High bounce rates combined with 0 seconds spent on site
- Multiple repeat visits/sessions from the same user without much activity
- Traffic sourced from a country or area not applicable to your business
- Sharp increases in new sessions/visitors
- Domain names and campaign source names that are not recognizable or verifiable
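The warning signs above can be turned into simple checks against exported session data. Here’s a minimal sketch in Python; the field names (`bounced`, `duration_s`, `country`, `source`) and the target-country list are hypothetical placeholders, so adjust them to match your own Analytics export.

```python
# Hypothetical service area for this example business
TARGET_COUNTRIES = {"CA", "US"}

def looks_like_bot(session: dict) -> bool:
    """Flag a session that matches one of the bot warning signs above."""
    # Bounced with 0 seconds on site
    if session.get("bounced") and session.get("duration_s", 0) == 0:
        return True
    # Traffic from outside the service area
    if session.get("country") not in TARGET_COUNTRIES:
        return True
    # Unverifiable referral domain (example TLDs only)
    if session.get("source", "").endswith((".xyz", ".top")):
        return True
    return False

sessions = [
    {"bounced": True, "duration_s": 0, "country": "CA", "source": "google"},
    {"bounced": False, "duration_s": 95, "country": "CA", "source": "google"},
]
flagged = [s for s in sessions if looks_like_bot(s)]
```

A pass like this won’t replace proper filtering, but it’s a quick way to estimate how much of an exported stats file looks suspicious before you invest time in filter rules.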
Google Analytics provides a few ways to clean these stats out for review. The first thing you should do for the health of your Analytics is leave your default view entirely alone. Make yourself a new website stats view, and enable Google’s built-in bot filtering option (‘Exclude all hits from known bots and spiders’). SilverServers is in the unique position of taking on existing Analytics accounts in various states of repair, and we see this issue quite often: accounts that have been running for years on the default view with no bot/spam removal. Making a new view and using Google’s built-in bot exclusion rules can take care of a lot of the problem.
Cleaning Out Google Analytics
Whenever you’re working on your stats, SilverServers suggests making a new view for your Analytics property. Sometimes this can be as simple as dedicating one view to desktop visitors and another to mobile, letting you focus on the actions each of those groups is taking without seeing them combined. Views also let you test and experiment with filters and segments without interfering with the stat collection Google is performing. Filtering someone’s own IP address out of their stats, for example, is usually something you should test in a separate view first, to make sure the filter works properly and that the IP address you entered is actually the one that needs filtering (typos happen!).
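Since a typo in a filtered IP address is exactly the kind of mistake a test view is meant to catch, it can also help to sanity-check the address itself before saving the filter. Here’s a small sketch using Python’s standard-library `ipaddress` module; the function name is our own, and the addresses are documentation-range examples.

```python
import ipaddress

def valid_filter_ip(candidate: str) -> bool:
    """Return True only if the string parses as a real IPv4/IPv6 address."""
    try:
        ipaddress.ip_address(candidate)
        return True
    except ValueError:
        return False

# A well-formed address passes; a dropped octet or impossible value fails.
valid_filter_ip("203.0.113.7")
valid_filter_ip("203.0.113.")
valid_filter_ip("999.0.113.7")
```

This only confirms the address is well-formed, not that it’s the right one — you still need to verify the visitor’s actual IP before filtering it.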
Filters are one of the most powerful pieces of the stat-cleaning process. A filter stops the visit from ever entering your Analytics view. It functions like a locked door on your website stats, entirely preventing specific visitors or bots/botnets from appearing in that view’s data. It works great when you have reliable information on what to filter, but can be dangerous if implemented improperly: many a website has had real, quality traffic accidentally filtered away into oblivion. Another reason to add more views! If you accidentally filter some real traffic, it shouldn’t affect your main untouched stat view, just the one you apply the filter to. Filters can do a lot of the work, but they need to be maintained on an ongoing basis.
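Google Analytics custom exclude filters accept regular expressions, and testing a pattern against known-good and known-bad referrers before deploying it is one way to avoid filtering real traffic into oblivion. Here’s a sketch in Python; the spam domain names are made-up placeholders, not real referrers.

```python
import re

# Hypothetical exclude pattern covering two spam-style referrer domains.
# Matches the domain itself or any subdomain of it, and nothing else.
SPAM_REFERRER = re.compile(r"(?:^|\.)(?:free-traffic|seo-offers)\.example$")

def should_exclude(referrer: str) -> bool:
    """Would this referrer be caught by the exclude pattern?"""
    return SPAM_REFERRER.search(referrer) is not None
```

Running a check like this over a list of your real referral sources before saving the filter shows exactly which traffic the pattern would silence — and whether any legitimate domains are caught by accident.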
Segments are a more temporary change that affects only the way your stats are viewed. They are very useful for drilling into your stats with certain demographics removed or particular referral sources in focus, giving you a more detailed set of information for a specific use. Segments can also be used to clean out bot traffic, but they let the traffic in to begin with. If filters are the door to your statistical house, segments are interchangeable windows that refine the way you see inside. They can still be effective for keeping your stats clean, but most often they’re used to test filter settings or to slice your data dynamically.
After you’ve got your filters and segments figured out, you’re all done, right? Nope! That’s the fun of the internet! New bots, using new domains or new IP addresses, will continue to hit your website’s stats with fake information. The larger, more nefarious bot systems can often change their domains, campaign source names, targets, and other identifying information. This means your filters and segments will likely need regular check-ups and updates whenever your stats show fresh signs of spam.
Using Other, Non-Google Analytics Software?
Google Analytics is not the only stat-collection service you can run on your website. If you’re using another solution, the spam fight doesn’t change; if anything, it often gets harder to manage. Many other stat-collecting solutions don’t provide filter/segment functionality the way Google Analytics does. Some of them rely on htaccess or other server-based blocking, which is often easily defeated. If you’re looking through your stats and see a ton of visits with low time on site and/or a high bounce rate combined with a low number of users, you’re likely seeing bot traffic. Before you make any decisions, or even just compare how your website is performing, those stats need to be cleaned out.
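To illustrate why server-side blocking is easily defeated: a typical .htaccess approach denies requests by User-Agent string, something like the sketch below (the bot names here are placeholders, not real crawlers). Any bot that spoofs a browser-like User-Agent walks straight past this rule.

```apache
# Block two example crawlers by User-Agent (names are placeholders).
# A spoofed bot simply sends a browser-like User-Agent and gets through.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (BadBotOne|BadBotTwo) [NC]
RewriteRule .* - [F,L]
```

Rules like this can trim known, honestly-identified bots, but they’re no substitute for cleaning the stats on the analytics side.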
As part of our GrassRoots SEO plan, SilverServers cleans out your Google Analytics stats every month to make sure they are as real and concrete as possible. Each month, we can help you make informed, solid decisions about what your users are doing and how they’re doing it. The web is already full of hurdles and hoops when it comes to marketing your business; let SilverServers help make sure your marketing decisions are informed by real people and not robots. Contact us today!
Read more info about how bot traffic can affect your ability to interpret your data.