Known Vulnerabilities
This section maps to Requirement 14.2-K
Given the protection mechanisms we use to prevent unauthorized software from running on VxSuite components, the simplest way to perform a vulnerability scan is to run Nessus on a host operating system, with VxSuite components running as guest virtual machines (VMs). We turn off our networking blocks on those VMs so that the vulnerability scan produces useful results.
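As a point of reference, below is a minimal sketch of a reachability probe one might run against a scan-target VM to confirm that, with the networking blocks re-enabled (the production configuration), its ports are unreachable from the host. This is illustrative only; the address `192.168.122.10` and the port list are hypothetical placeholders, not actual VxSuite values.

```ts
import { connect } from 'node:net';

// Hypothetical helper: attempt a TCP connection to one port on a scan-target
// VM and report whether it is reachable. Host and ports are placeholders.
function probe(host: string, port: number, timeoutMs = 2000): Promise<boolean> {
  return new Promise((resolve) => {
    const socket = connect({ host, port });
    const finish = (reachable: boolean) => {
      socket.destroy();
      resolve(reachable);
    };
    socket.setTimeout(timeoutMs, () => finish(false));
    socket.once('connect', () => finish(true));
    socket.once('error', () => finish(false));
  });
}

// With the networking blocks enabled (production configuration), every probe
// should report "unreachable"; the blocks are disabled only for the Nessus scan.
async function main() {
  for (const port of [22, 80, 443, 3000]) {
    const reachable = await probe('192.168.122.10', port);
    console.log(`port ${port}: ${reachable ? 'reachable' : 'unreachable'}`);
  }
}

main();
```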
Nessus finds the same 15 findings on each of these components. All of them fall into the "Info" severity category, meaning they only allow information gathering about the host, not any corruption or direct attack. Each finding is listed below with an explanation of why it is not a vulnerability of concern.
Vuln # | Description | Explanation/Mitigation
---|---|---
1 | ICMP Timestamp Request Remote Date Disclosure | With networking protections in place, as they are on a production machine, this issue would not exist. Even if it did, it only allows an attacker to read the machine's current clock, which is not a problem.
2 | Common Platform Enumeration (CPE) | This only indicates that a particular Nessus plugin ran to determine platform properties.
3 | Device Type | With networking protections in place, as they are on a production machine, this issue would not exist. Even if it did, it only allows an attacker to determine what kind of device this is (manufacturer, CPU power, etc.). This is considered public information, so it is not a threat.
4 | Ethernet MAC Addresses | With networking protections in place, as they are on a production machine, this issue would not exist. Even if it did, it only allows an attacker to determine the MAC address of an Ethernet port, which is not a threat.
5 | HTTP Methods Allowed (per directory) | With networking protections in place, as they are on a production machine, this issue would not exist. VxSuite uses HTTP internally within each component, so with networking protections turned off, it is expected that allowed HTTP methods can be detected. This is not an issue on a production system.
6 | HyperText Transfer Protocol (HTTP) Information | With networking protections in place, as they are on a production machine, this issue would not exist. Even if an attacker accessed this information, it is not confidential.
7 | KVM / QEMU Guest Detection (uncredentialed check) | This finding is not relevant on a production machine, which is not a VM. The component runs as a VM only because of the vulnerability-scan testing setup.
8 | Nessus SYN scanner | With networking protections in place, as they are on a production machine, this issue would not exist.
9 | Nessus Scan Information | This only indicates that a particular Nessus plugin ran to determine running services.
10 | OS Identification | With networking protections in place, as they are on a production machine, this issue would not exist. Even if it did, it only allows an attacker to determine the operating system running on a VxSuite component, which is already public information.
11 | Service Detection | With networking protections in place, as they are on a production machine, this issue would not exist. Even if it did, the services running on a VxSuite component are publicly documented, and it is not considered a threat for an attacker to know them.
12 | TCP/IP Timestamps Supported | With networking protections in place, as they are on a production machine, this issue would not exist. Even if it did, this only allows an attacker to determine the uptime of the VxSuite component, which is not confidential information.
13 | Traceroute Information | With networking protections in place, as they are on a production machine, this issue would not exist.
14 | Web Server No 404 Error Code Check | The web server in question is used only for internal purposes of the VxSuite component; it is never presented as a public web site. Whether it returns a 404 code or not is therefore not important (see the sketch after this table for why this finding typically appears).
15 | Web Server robots.txt Information Disclosure | The web server in question is used only for internal purposes of the VxSuite component, so the robots.txt file is irrelevant.