Running a large, high-traffic website can sometimes be quite daunting: you need to ensure all services are up and functioning as expected, and that response times are acceptable so consumers aren't deterred from your site. Of course, you also want to make sure the people visiting your website are real people, not bots or hackers trying to break into it.
In my world, I've found four tools that help me manage the risks involved and react very quickly to what is happening in our environment.
1 – Google Analytics
Google Analytics (paid | free) provides immense insight into what is going on with your website, helps you identify bottlenecks, and shows you the reach your site is getting. Its integration with Google Webmaster Tools and Google AdWords really closes the gaps when it comes to marketing and SEO.
2 – DotCom Monitor
One of my favourite tools currently for monitoring my suppliers' sites and my own. DotCom Monitor provides a great stack of products to ensure you know something is wrong on your website, server or network before your clients do.
Some of their features include:
- Web and network uptime monitoring: polls your sites, protocols, DNS, web services and more at a preset interval
- Page speed monitoring: checks page load times in Chrome, Firefox and IE to keep an eye on front-end performance
- Web app monitoring: checks the consistency and functionality of your live web applications. Personally I do not use this; I use Neustar Web Performance tools for web application functionality.
- Real-time alerts via email and SMS.
All of the above can be monitored from "agents" around the world (AWS servers). Some locations are premium: for Cape Town in South Africa, for example, you have to pay a little bit more.
I also use DotCom Monitor specifically to build SLA reports for my data and service providers to check up on their uptime.
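The kind of uptime polling described above can be sketched in a few lines of Python: hit a list of URLs on a fixed interval and flag slow or failed responses. The URLs, timeout and interval here are placeholder values, not anything DotCom Monitor actually uses.

```python
import time
import urllib.request

SITES = ["https://example.com", "https://example.org"]  # hypothetical targets
TIMEOUT_SECONDS = 10
INTERVAL_SECONDS = 60  # the "preset interval" between polls

def check(url):
    """Request a URL once; return (HTTP status, elapsed seconds) or (None, error)."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=TIMEOUT_SECONDS) as resp:
            return resp.status, time.monotonic() - start
    except Exception as exc:
        return None, exc

def poll_once(sites):
    """Check every site once and collect the results per URL."""
    return {url: check(url) for url in sites}

# A real monitor would loop forever, sleeping INTERVAL_SECONDS between rounds,
# and fire an email/SMS alert whenever a status is not 200.
```

A hosted service like DotCom Monitor adds the parts that matter: polling from many geographic agents at once and alerting you before your clients notice.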
3 – Neustar Web Performance
I love playing with this. It's a very powerful load testing and web monitoring tool that uses the Selenium framework to build your test cases. In a nutshell…
You build "test cases" with Selenium and upload them into Neustar. It then bombards your website with x number of concurrent "users" from all over the world so you can see the breaking point of your server and where the bottlenecks are, all supplied with nice reports showing failures and so on.
You can define how many users are sent to your website per minute and increase that number in increments of your choice. For example: start with 0 users, ramp up to 5 concurrent users over the next 5 minutes, then ramp up to 100 users over the next 10 minutes.
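The ramp-up schedule above can be illustrated with a rough Python sketch. Real tools like Neustar replay full Selenium scripts; here each "user" is just a thread issuing GET requests against a placeholder URL, and the schedule is the hypothetical 5-then-100-users example.

```python
import threading
import time
import urllib.request

TARGET = "https://example.com"  # placeholder target site

def user_session(stop_event, latencies):
    """One simulated user: request the page in a loop, recording latency or failure."""
    while not stop_event.is_set():
        start = time.monotonic()
        try:
            urllib.request.urlopen(TARGET, timeout=10).read()
            latencies.append(time.monotonic() - start)
        except Exception:
            latencies.append(None)  # a None marks a failed request

def ramp_schedule():
    # (concurrent_users, duration_seconds): 5 users for 5 min, then 100 for 10 min
    return [(5, 300), (100, 600)]

def run(schedule):
    """Start threads up to each step's user count, hold for its duration, then stop."""
    stop = threading.Event()
    latencies, threads = [], []
    for users, duration in schedule:
        while len(threads) < users:
            t = threading.Thread(target=user_session,
                                 args=(stop, latencies), daemon=True)
            t.start()
            threads.append(t)
        time.sleep(duration)
    stop.set()
    return latencies
```

The point of the ramp is to find the load at which latencies climb or failures appear; a flat blast of traffic would only tell you pass or fail.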
The only downside of their offering is that there are no South African-based agents, so your response times are much higher, although it will still show you the points of failure in relative terms.
Here is an example of a response time report per transaction from a test I ran with 300 users.
4 – ThreatMetrix Device Identification
This is something I've only started playing with recently, but there is so much potential. ThreatMetrix has a whole stack of products specialising in cybercrime protection. I, however, am only working with their Device Identification product.
Their Device Identification doesn't just look at IPs and cookies; it takes an electronic fingerprint of your connection and browser details such as browser window size, "cookies enabled", "HTML5 storage enabled", proxy details, geolocation, real geolocation, "are you connecting from a known botnet ring" and much more. They also have behavioural algorithms that can be utilised.
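The core idea of device fingerprinting can be sketched simply: combine many browser and connection attributes into one stable hash, so the same device is recognised even when the IP or cookies change. The attribute names below are invented for illustration and are not ThreatMetrix's actual signals.

```python
import hashlib
import json

def fingerprint(attributes):
    """Serialise the attribute dict deterministically, then hash it."""
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical device profile collected client-side
device = {
    "user_agent": "Mozilla/5.0 ...",
    "screen": "1920x1080",
    "cookies_enabled": True,
    "html5_storage": True,
    "timezone": "Africa/Johannesburg",
    "via_proxy": False,
}

device_id = fingerprint(device)  # same attributes -> same 64-char hex id
```

A real product goes much further, weighting attributes, tolerating small changes, and cross-referencing known botnet and proxy lists, but the fingerprint-as-identifier idea is the same.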