I thought I’d pen a short post on some free URL scanning services that can be helpful for the occasional malware scan or as a starting point for initial investigative work.
There are numerous services out there that provide information on URLs and domains. The majority of sites provide reputational or historical review data: for instance, whether a domain has been associated with malicious activity or spamming.
A smaller number provide free content scanning, which is the focus of this blog post. This contrasts with purely reputational and threat-intelligence-focused sites, which interrogate blacklists and other information sources to assess the reputation of a particular domain.
Content scanning is typically offered as part of a broader contracted service, such as a continuous monitoring package; SiteLock and Sucuri are two examples. Such packages tend to provide a great deal of value-added capability, including security alerts, malware removal/remediation, server-side scanning, file integrity monitoring, and more.
If you are responsible for a site, it might be sensible to set up a continuous monitoring package for peace of mind. Manually checking with the sites listed in this post is unlikely to be sufficient as an enduring solution. But for incident-driven or occasional use, URL content checkers at least provide indicators that can be followed up on.
Sucuri offers a free scanner, SiteCheck, and a more comprehensive paid-for service. It’s a good design with a logical layout. Platform identification is provided, including the host language version number if available, along with a simple, easy-to-understand rating scale for the security level of the site. Active scanning covers malware, injected spam, defacement, and internal server errors. SiteCheck also checks nine website blacklist databases, covering both email (spam) and web content.
SiteCheck does not spider the domain comprehensively, so the most valuable features are limited. The number of URLs checked is low, and as a result the coverage is sparse.
Overall, it’s a good service that provides actionable security intelligence.
One of the better-known active scanning sites, and more or less the benchmark for content scanning of URLs. VirusTotal currently checks a site against 67 AV engines and reputation sources, offering an overall score. Because of its popularity, you can expect the data to be fairly recent when submitting a URL to VirusTotal.
What VirusTotal does not provide is any way of identifying a false positive, or of combining results to cater for false positives. In addition, it does not provide clear information on when checks were made by the third-party AV engines.
VirusTotal is the clear leader in this space and the “goto” site for website malware checking.
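For repeat or occasional scripted checks, VirusTotal also exposes its URL reports through a public v3 REST API. Below is a minimal sketch, assuming you hold a free API key; the `VT_API_KEY` environment variable name and the example URL are my own choices for illustration, not anything VirusTotal mandates:

```python
import base64

def vt_url_id(url: str) -> str:
    """Compute the VirusTotal v3 URL identifier:
    URL-safe base64 of the URL, with '=' padding stripped."""
    return base64.urlsafe_b64encode(url.encode()).decode().rstrip("=")

def vt_report_request(url: str, api_key: str):
    """Build the (endpoint, headers) pair for fetching an existing
    URL report from the VirusTotal v3 API."""
    endpoint = f"https://www.virustotal.com/api/v3/urls/{vt_url_id(url)}"
    return endpoint, {"x-apikey": api_key}

if __name__ == "__main__":
    import json, os, urllib.request
    key = os.environ.get("VT_API_KEY")  # free public keys are rate-limited
    if key:
        endpoint, headers = vt_report_request("http://example.com/", key)
        req = urllib.request.Request(endpoint, headers=headers)
        with urllib.request.urlopen(req) as resp:
            report = json.load(resp)
        # Per-engine verdicts are aggregated under last_analysis_stats,
        # e.g. counts of harmless/malicious/suspicious/undetected engines.
        print(report["data"]["attributes"]["last_analysis_stats"])
```

The aggregate counts make it easy to apply your own false-positive threshold (say, treating one or two "malicious" verdicts out of 67 engines as noise), which partly works around the limitation noted above.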
This site provides a web reputation score, categorisation, and reputation influences. It also indicates any infections in the last twelve months for the URL. It’s a fairly basic interface, but it does provide indicators of malicious content over a long time frame.
BrightCloud offers a good secondary perspective, including information on the factors that contribute to its reputation score.
One of the more detailed online URL checkers, albeit based on passive data sources and therefore not strictly a content checker, the Talos portal overview page provides information on geolocation, owner details, reputation data (if known), email volume, and whether the site is listed on the Cisco blacklists. Additional information is also available.
What’s missing from the Talos presentation is some form of categorisation, reputation influences, and malware reporting. In fact, there is no confirmation of malware or otherwise on the dashboard.
Comprehensive threat intelligence platforms such as Talos and X-Force Exchange are unlikely to be ignored in any serious investigation, and they offer considerable value. However, for occasional reviews of a site’s security status, you’d probably find VirusTotal the better option.
Quttera provides a real-time scan of the specified URL or domain. It processes requests using a queue, so responses are not immediate. The coverage is comprehensive, and the tool will scan a wide range of URLs for a site, presumably through spidering. Once scanning is complete, a “clean” (or otherwise) status is presented, and a detailed report is also available.
The detailed report covers geolocation, IP address, server type, the numbers of malicious, suspicious, potentially suspicious, and clean files, the number of external links detected, and whether the site is on any of five blacklists.
It’s a good tool that provides some useful data, although the interface is a bit clunky and the advertising is a distraction.
Zulu gets positive press in the forums and across the Internet. After submitting a URL, the site will analyse it and provide a report. None of the reporting data indicates that Zulu spiders the site, so it would be sensible to assume scanning is limited to the specific URLs submitted. The URL is rated with a score, with higher numbers indicating a worse outcome.
Interestingly, Zulu also checks VirusTotal for malicious content, along with a variety of other network characteristics. A Zulu report contains HTTP transaction data, location, web server type, external elements such as hyperlinks, content checks against submissions, parked status, hashes, and heuristics.
Zulu is a useful tool, especially with the VirusTotal integration; however, the response times are slow enough to significantly undermine its usability.
Let’s hope that gets fixed, as Zulu is a worthy addition to the information security toolset.