Abstract
Based on all of the literature provided in previous labs, performing a penetration test is all about the tools used. Tools represent the basis for making the security professional’s life simpler, and more productive and therefore higher paying. The first issue with just using tools for penetration testing is that they truly are only one half of the overall picture. The value that professionals who use these tools add is in the analysis of the results. The second issue with using tools is that anyone can have access to them, usually without charge. With that in mind, the focus of lab five is on exploiting security vulnerabilities without the use of tools.
In lab five, we will be focusing on taking the documentation provided by NIST to secure a typical Windows Server 2003 installation, and using that against those that do what it says. By examining the recommendations provided by NIST we will walk though the different ways to exploit a system that does not institute these recommendations in table form considering various aspects along the way.
Literature Review
Performing a vulnerability assessment is no longer just a good idea; it has become a way of life. Whether in response to an actual or perceived incident or as a preventative defense measure network security professionals like David Morgan from ISS Xforce agree that vulnerability assessment is just good practice. According to Elspeth Wales in Vulnerability Assessment Tools, complacency about the security state of a network is a recipe for disaster (Wales, 2003). In that article they discuss that while the assessment is important it is just one piece of the puzzle. However the lack of any research question, or methods makes the stance of the author seem somewhat weak. The article does make a few good points, one of which is without the added input of a information security professional the reports that are generated the suite of automated tools available for penetration testing are useless (Wales, 2003). To that end in Topological Analysis of network attack vulnerability (Jajodia, Noel, & O’Berry, 2005) the authors present a value add to NESSUS that may solve that very issue. Their research shows that they run a NESSUS scan on a particular network and then analyzing the vulnerability data against a known exploit database through a custom analysis engine. Once that data is captured a detailed report of attack vectors is generated. That report shows both highly possible direct attacks and indirect attacks involving all of the systems of a network (Jajodia, Noel, & O’Berry, 2005). This report discards impossible and highly unlikely vulnerabilities through data analysis in their custom engine, and only lists vulnerabilities that are exploitable on the systems scanned. They give an example of attacking an FTP server named ‘ned’ that does not have FTP open to the Internet by exploiting a vulnerability in an IIS server named ‘maude’ and then attacking the ‘ned’ using ‘maude’ as more of a gateway or relay box. Their uses list improved IDS tuning, and better vulnerability assessment results (Jajodia, Noel, & O’Berry, 2005).
Wales also makes the point that according to Martin Finch of Commissum, performing a vulnerability assessment needs to be part of a risk assessment rather than just an IT audit. The risk assessment needs to focus on software applications as well as hardware and network infrastructure (Wales, 2003). Both Du & Mathur in Vulnerability Testing of Software System Using Fault Injection (Du & Mathur, 1998) and Jorgensen in Testing with Hostile Data Streams (Jorgensen, 2003) present solutions to the software assessment problem. Du & Mathur present a fault injection model to test applications for possible security issues. By taking input that generally considered ideal, modifying or corrupting it, presenting that data to an application and observing the corresponding environment the application operates in at output they make a determination as to is the application is secure or not. If the environment the application operates is perturbed as they put it then that application is considered to have a possible security flaw. If that environment is no perturbed the application is deemed secure (Du & Mathur, 1998). Du & Mathur’s methods are based on an experimental study of their ideas in a controlled environment. This concept will be used directly solve at least the last portion of the lab. Their research does seem to be complete with supporting data from their experiments and conclusions seem logical based on that data. In Testing with Hostile Data Steams, at first Jorgenson seems to disagree with Du & Mathur in that his methods of testing a software application’s response to corrupt and possibly hostile data streams is more effective than fault injection (Jorgensen, 2003). However further into the article we cannot help but see direct correlations between what Du & Mathur have accomplished and what Jorgensen has accomplished. According to Jorgensen, his approach is taking valid data in the form of files, corrupting that data with the addition of random information, and observing that data in the corresponding application. The results of that observation, namely through application function, give him the data he needs as to application security. If the application fails in some way it is deemed insecure otherwise it is not (Jorgensen, 2003). Jorgensen states that fault injection is not as effective as corrupt data streams because it is totally random. However we think that corrupt data streams are also mostly random. This makes us question the Jorgenson’s stance and approach. His methods are based on experiment and his conclusions seem to be drawn from his observations. We also question his research as it is focused on just Adobe Reader and the injection of corrupt data to PDF files. This seems like an extremely narrow test case especially when he states that his ideas are based around stenography (Jorgensen, 2003). Stenography type vulnerability has been more pervasive in GIF and JPEG then in PDF. This appears to be an omission by Jorgensen, or possibly even an error. These black box based application testing models are none the less an option for application vulnerability testing.
Wales goes on to state that according to Jay Heiser of Trusecure one of the major problems with not hiring out to a professional penetration testing firm, and doing the analysis in house is that is has the potential to bring down a production network (Wales, 2003). The tone of this idea as presented by Wales seems to be that in house IT staff are generally not trained well enough to perform an in depth penetration test and audit. They seem to say that there is more than one tool out there, and multiple tools are going to list multiple vulnerabilities. The professional needs to do the leg work and no in house staff really can do that. We disagree with that idea which is why we are taking this class, but understand in house and external audits are the most complete approach to take (Wales, 2003). J. Aycock and K. Barker also seem to have a solution to the under trained in house staff problem in their article titled Viruses 101 (Aycock & Barker, 2005). The authors of Viruses 101 give an overview of a class at the University of Calgary looking at in depth security defense by teaching students how to think like the writers of viruses and malware (Aycock & Barker, 2005). They list requirements to get into the class such as legal and ethical requirements as well as student standing and a written essay. They also describe the course syllabus, objectives, assignments and deliverables. The course itself seems to be an introductory course for higher level students into thinking like a “hacker” (Aycock & Barker, 2005). The interesting part of the article was the amount of negative press received by the university by anti-virus vendors, making us question if vendors like Symantec are really out to protect those who purchase their product. The course seems to slightly follow the course we are currently taking in the areas of student interaction, and peer pressure (Aycock & Barker, 2005). Aycock, and Barker show a well considered report that could be beneficial for anyone looking to start up a program like theirs.
Methods
As with all the previous labs, and the two remaining ones, the first step in completing the process was a literature review. By reading and understanding the literature provided by the instructor and performing a synthesis of that information to our own benefit we hope to gain a better understanding of the topics presented. Our strategy for lab five is to follow the requirements of the lab design document and course syllabus to present a complete lab report. Based on our understanding there are six steps to the completion of the lab, outside of the literature review. The first step is to get approval to use the Windows 2003 Checklist V6.1.12 as provided by the National Institute of Standards and Technology or NIST (National Institute of Standards and Technology, 2009). The second step is, based on that document, identify what each change recommended is accomplishing. The third step is to define where that recommended change falls within the realm of the OSI seven layer model. The fourth step was to define where that change falls within the realm of the McCumber cube. The fifth step is to consider what changes when large scale security patches come into play, how they affect the analysis we are performing, and is there any way to gain additional information on what the patch is correcting. The final step is to test our results against our live lab systems. We propose that the majority of our technique will result in the creation of a table.
Findings
Table one lists the findings of the research provided by team two, in completion of the first four steps of the lab outside of the literature review. Table one provides the results of comparing the security recommendations of NIST as they pertain to Windows Server 2003 (National Institute of Standards and Technology, 2009). An exploit is also listed explaining how each entry could be used against someone who does not take the recommendations of NIST to heart. Our analysis did omit three possible security configuration recommendations. The omitted results were based on security and group policy templates that were specifically designed by NIST to work within with the confines of the US governments Active Directory forest(s). These did not seem to apply to organizations outside the government.
Team two also took a sampling of the security checks provided by NIST. That sampling was used to test how a system that does not implement the recommendations of NIST is vulnerable to exploits without tools. That list included server message block client packet signing. If that setting is disabled then any system other than domain connected Windows 2000 or greater clients can connect to an SMB share. This includes Apple machines, and Linux machines, since backtrack runs on Linux this seems like an important setting to have enabled. Another recommendation of installing anti-virus software is obviously important to the overall security of a system. Insuring that all volumes on a system are NTFS insures that permissions can be assigned to all information on those volumes, which includes removing any anonymous permission. Another selected setting includes disabling the default admin shares on the system, which prevents unrestricted access to the file system even if passwords are guessed. NIST also recommends strong password policies again something that seems overly obvious, especially to information security experts. After reviewing a mall sampling of security checks in table one, it became overly obvious that most of these checks while over looked by most, are generally common sense to increase the security of information assets. This is apparent because the first check is server physical security. This is an obvious enhancement to system security.
Issues
Team two experienced only one issue with lab five. That issue was with the seventh requirement of the lab as per the lab design document. That requirement is: Using your previous work take the list of issues, and find tools that will be associated with each issue. Create a table or matrix. That requirement seemed to be outside the scope of the lab as lab five deals in exploits WITHOUT tools. It also seems there is no tool to exploit something like server physical security. If a server is not physically secured, then by definition no tool is required to exploit that lack of physical insecurity.
Conclusion
The table of results for lab five was something of an eye opener. Team two had never considered taking an information security document and using against those who did not perform the actions it recommended. The actions recommended by NIST are very rigorous, but will harden a Windows Server 2003 machine against most attacks, especially ones that use “script kiddy” level tools. The results provided will no doubt be useful now and in the future for both team two and anyone else who cares to look into the topic of performing an information security audit or penetration test without any tools.
Tables & Figures
Table One:
Checklist Item |
Description |
OSI Layer |
McCumber Coordinate |
Exploit |
|
Physical Security |
Verifies, by observation, that |
0 |
Integrity |
Modifying unsecured servers |
|
Users with Administrative Privileges |
verifies that each user with administrative |
8 |
Integrity |
Accessing system with single admin account |
|
Backup Administrator Account |
verifies that a backup administrator account |
7 |
Confidentiality |
Using Backup Account to access system |
|
Administrator Account Password Changes |
Verifies that the passwords for the default and backup administrator |
8 |
Confidentiality |
Learning default admin password |
|
Shared Accounts |
Verifies that all shared accounts on the |
8 |
Confidentiality |
Anonymous access with shared account |
|
Access to Windows Event Logs |
Verifies that access to the Windows Event Logs |
7 |
Integrity |
Deleting Windows Event Log |
|
Reviewing Audit Logs |
Verifies that Audit logs are reviewed |
7 |
Integrity |
Deleting Audit Logs |
|
Archiving Audit Logs |
verifies that Audit logs are archived to ensure data is not being lost |
7 |
Integrity |
Modifying unsecured Audit Logs |
|
System Recovery Backups |
verifies that System information backups are maintained |
7 |
Confidentiality |
Modifying Unsecure System Backups |
|
Security Configuration Tools |
verifies that the site has a process for implementing security configurations on a system |
7 |
Confidentiality |
Learn current configurations |
|
System Configuration Changes |
The site does not use a tool to compare system files (*.exe, *.bat, *.com, *.cmd and *.dll) on servers against a baseline, on a weekly basis, then this is a finding. |
7 |
Integrity |
Execute custom system files |
|
Unencrypted Remote Access |
Check applies to machines whose services are accessed remotely. (E.g. FTP, Telnet, etc.) |
7 |
Confidentiality |
Anonymous access remotely |
|
Intrusion Detection |
a Server does not have a host-based intrusion detection (HID) system installed and enabled, then this is a finding |
7 |
Confidentiality |
Anonymous system access |
|
PKI Authentication Required |
verifies that the system is configured to require the use of a CAC, PIV compliant hardware token or Alternate Logon Token (ALT) for authentication |
5 |
Integrity |
Break authentication |
|
Service Packs |
verifies that the most-current service pack for Windows Server 2003, 128 bit version is installed |
7 |
Integrity |
Using known working exploits that are fixed with new service packs |
|
Strong Password Filtering |
determines whether the site has implemented a password filter that enforces the DoD requirements |
7 |
Confidentiality |
Weak passwords are easier to crack |
|
Software Certificate Installation Files |
verifies that software certificate installation files have been removed from a system |
7 |
Integrity |
Modifying Installation Files |
|
Local NTFS Volumes |
verifies that all local drives are configured using the NTFS format, enabling the use of Windows Server 2003’s security and auditing features |
7 |
Confidentiality |
Anonymously accessing local volumes |
|
File Shares |
verifies that user-created file shares drives are configured properly |
7 |
Integrity |
Access shared drives |
|
Installed Services |
verifies that prohibited services are not activated |
7 |
Integrity |
Install prohibited services |
|
Unnecessary Services |
verifies any services listed for which the site has documented exceptions are also permitted |
7 |
Integrity |
Start prohibited services to gain access |
|
Virus-Protection Software |
verifies that a virus-protection program approved by DOD-CERT is installed and activated on the Windows Server 2003 system |
7 |
Integrity |
Install Virus |
|
Printer Share Permissions |
check verifies that shared printers have properly configured share permissions |
6 |
Integrity |
Modify Printer Shares |
|
Booting into Multiple Operating Systems |
verifies that the local system boots directly into Windows Server 2003 |
7 |
Confidentiality |
Boot into different OS |
|
Password Policy Configuration |
verifies that the system’s password policy conforms to DISA standards |
7 |
Integrity |
Discover user password |
|
Password Uniqueness |
Enforce password history is less than 24 passwords |
7 |
Integrity |
Password would remain the same |
|
Maximum Password Age |
Maximum password age is greater than 60 days |
7 |
Integrity |
Have access for long time |
|
Minimum Password Age |
Minimum password age is less than one day |
7 |
Integrity |
Discover user password |
|
Minimum Password Length |
Minimum password length is less than 14 |
7 |
Integrity |
Discover user password |
|
Enable Strong Password Filtering |
verifies that Windows 2003 is implementing a minimum level of strong password filtering |
7 |
Integrity |
Small changes to previous password, easier to guess |
|
Disable Reversible Password Encryption |
verifies that Windows Server 2003 is configured to prevent passwords being stored using a two-way hash |
7 |
Integrity |
Discover user password |
|
Account Lockout Configuration |
verifies that the system’s account lockout policy conforms to DISA standards |
7 |
Integrity |
Continued attempts to discover password |
|
Lockout Duration |
requiring an administrator to unlock the account |
7 |
Confidentiality |
Become Admin to unlock account |
|
Bad Logon Attempts |
Account lockout threshold” is “0” or more than three attempts |
7 |
Integrity |
Continued attempts to discover password |
|
Bad Logon Counter Reset |
Reset account lockout counter after value is less than 60 minutes |
7 |
Integrity |
Allows for continued attempts |
|
Kerberos Policy |
verifies that the Kerberos authentication settings are configured to the minimum required DISA standards |
7 |
Integrity |
Break authentication |
|
Kerberos – User Logon Restrictions |
verifies that the Kerberos Key Distribution Center (KDC) validates every request for a session ticket against the user rights policy of the target computer |
7 |
Integrity |
Anonymous login to server |
|
Kerberos – Service Ticket Lifetime |
verifies that the maximum amount of time (in minutes) that a granted session ticket can be used to access a particular service, meets DISA standards |
5 |
Integrity |
Session to last a lifetime |
|
Kerberos – User Ticket Lifetime |
verifies that the maximum amount of time (in hours) that a user’s ticket-granting ticket (TGT) may be used, meets DISA standards |
7 |
Integrity |
Users ticket to last a lifetime |
|
Kerberos – User Ticket Renewal Lifetime |
verifies that the period of time (in days) during which a user’s ticket-granting ticket (TGT) may be renewed, meets DISA standards |
7 |
Integrity |
Denies users renewal |
|
Kerberos – Computer Clock Synchronization |
verifies that the maximum time difference (in minutes) that Kerberos will tolerate between the time on a client’s clock and the time on a server’s clock, while still considering the two clocks synchronous, meets DISA standards |
7 |
Integrity |
Change system time |
|
Audit Policy Configuration |
verifies that the minimum user account and object auditing on the local system is configured to DISA standards |
7 |
Confidentiality |
Modify audit policy |
|
Auditing Configuration |
system does not audit the events listed |
7 |
Confidentiality |
Modify audit configuration |
|
User Rights Policy Configuration |
verifies that the system’s user rights and advanced user rights policies are configured in accordance with DISA requirements |
7 |
Integrity |
Modify users and advance users permission |
|
User Rights Assignments |
Some applications require one or more of these rights to function |
7 |
Confidentiality |
Software having full rights to function |
|
Users Granted “Act as part of the operating system” Privilege |
users and user groups that are assigned this right can bypass all security protective mechanisms that apply to all users, including administrators |
7 |
Confidentiality |
Running files as systems |
|
User Right “Debug programs” |
it provides access to the kernel with complete access to sensitive and critical operating system components |
7 |
Integrity |
Modify the kernel |
|
Accounts/Groups not given “Deny access to this computer from network” Privilege |
right to log on to the computer from the network can give a user access to information that can be used to exploit the system |
7 |
Integrity |
Access multiple computers |
|
Security Options Configuration |
verifies that security options on the local system are configured to DISA standards |
7 |
Integrity |
Modify security options |
|
Disable Guest Account |
that Windows Server 2003 is configured to disable the built-in guest account |
7 |
Integrity |
Gain access with Guest account |
|
Limit Blank Passwords |
verifies that Windows Server 2003 is configured to limit the use of blank passwords to local console logon only |
7 |
Confidentiality |
Instant access |
|
Built-in Administrator Account Renamed |
verifies that the built in Administrator account has been renamed |
7 |
Integrity |
Username is already known |
|
Built-in Guest Account Renamed |
the built in guest account has been renamed |
7 |
Integrity |
Username guest is known |
|
Halt on Audit Failure |
verifies that the site has a documented policy and provable procedures in place to identify, in a timely manner, that a system has stopped writing to the Event logs |
7 |
Integrity |
Send false positive |
|
Undock Without Logging On |
verifies that Windows Server 2003 is configured to require logon for undocking a machine |
7 |
Integrity |
undock machine without log in |
|
Format and Eject Removable Media |
verifies that Windows Server 2003 is configured to only allow Administrators to format and eject removable media |
7 |
Integrity |
Format local media |
|
Secure Print Driver Installation |
verifies that Windows Server 2003 is configured to allow only members of the “Administrators” and “Power Users” user groups to install printer drivers |
7 |
Integrity |
Install custom printer driver |
|
Unsigned Driver Installation Behavior |
verifies that the unsigned driver behavior is set to “Warn but allow installation” (recommended setting) or “Do not allow installation |
7 |
Integrity |
Allows for any installation |
|
Task Scheduling – Server Operators Group |
verifies that the Server Operators group is prevented from using the Task Scheduler Service (AT command) to schedule a task to automatically run |
7 |
Integrity |
Automatically run jobs |
|
LDAP Signing Requirements |
verifies that the Lightweight Directory Access Protocol (LDAP) server requires LDAP clients to negotiate data signing |
7 |
Confidentiality |
Clients negotiate without data signing |
|
Computer Account Password Change Requests |
verifies that requests to change the computer account password are not refused by the Domain Controller |
7 |
Confidentiality |
Refuse computer account password change |
|
Encrypting and Signing of Secure Channel Traffic |
verifies that the computer will always encrypt or sign secure channel data |
7 |
Integrity |
decrypt channel data |
|
Encryption of Secure Channel Traffic |
verifies that the computer will always digitally encrypt secure channel data when possible |
7 |
Integrity |
spoof secure channel |
|
Signing of Secure Channel Traffic |
verifies that the computer will always sign secure channel data when possible |
7 |
Integrity |
spoof secure channel |
|
Resetting Computer Account Password |
verifies that the computer account password is not prevented from being reset every week |
7 |
Integrity |
Capture password during reset |
|
Maximum Machine Account Password Age |
Verifies that the computer account password is changed, at a maximum, every 30 days. |
7 |
Confidentiality |
Password would remain the same |
|
Strong Session Key |
verifies that the computer is configured to require a strong session key |
7 |
Confidentiality |
disable session keys |
|
Display of Last User Name |
verifies that the system is configured to prevent the display of the last user name on the logon screen |
7 |
Integrity |
Knowledge of usernames |
|
Ctrl+Alt+Del Security Attention Sequence |
verifies that the Ctrl+Alt+Del security attention sequence is enabled |
7 |
Integrity |
Modify security attention |
|
Display Legal Notice |
verifies that Windows is configured to display a legal notice prior to logging on |
7 |
Confidentiality |
Remove Legal Notice |
|
Disable Caching of Logon Credentials |
verifies that Windows Server 2003 is configured to limit copies of user profiles saved during interactive logon |
7 |
Confidentiality |
Use caching to steal login credentials |
|
Password Expiration Warning |
verifies that Windows Server 2003 is configured to warn users in advance when their passwords will expire |
7 |
Integrity |
Spoofs password expiration |
|
Domain Controller Authentication to Unlock Workstation |
check verifies that Windows Server 2003 is configured to require the system to pass the credentials to the domain controller (if in a domain) for authentication, before allowing the system to be unlocked |
7 |
Confidentiality |
unlock without domain controller |
|
Smart Card Removal Option |
verifies that the Smart Card removal option is set to Lock Workstation (minimum requirement) or Force Logoff |
7 |
Integrity |
Force logoff without smart card removal |
|
SMB Client Packet Signing (Always) |
verifies that the SMB Client policy is set to SMB packet signing |
7 |
Integrity |
Modify packet signing to never |
|
SMB Client Packet Signing (if Server agrees) |
verifies that the SMB Client policy is set to SMB packet signing when possible |
7 |
Integrity |
Server set to always agree |
|
Unencrypted Passwords to 3rd Party SMB Servers |
verifies that the computer will not send clear-text passwords to non-Microsoft SMB servers which do not support password encryption during authentication |
7 |
Integrity |
View passwords in clear text |
|
Idle Time Before Suspending a Session |
verifies the amount of continuous idle time that must pass in a Server Message Block (SMB) session before the session is disconnected due to inactivity |
7 |
Integrity |
Shorten time to suspend a session |
|
SMB Server Packet Signing (Always) |
verifies that the SMB Server policy is set to SMB packet signing |
7 |
Integrity |
Modify packet signing to never |
|
SMB Server Packet Signing (if client agrees) |
verifies that the SMB Server policy is set to SMB packet signing when possible |
7 |
Integrity |
Server set to always agree |
|
Forcibly Disconnect when Logon Hours Expire |
verifies Windows Server 2003 is configured that, if a user has restricted hours, this setting is enabled so the server will disconnect the user when the user’s logon hours expire |
7 |
Integrity |
Modify Logon Hours |
|
Disable Administrator Automatic Logon |
verifies that Windows Server 2003 is configured to prevent the automatic logon of the Administrator account and does not save a default password |
7 |
Integrity |
Automatically login with Admin rights |
|
IP Source Routing |
verifies Windows Server 2003 is configured to protect against packet spoofing |
3 |
Integrity |
Spoof packets |
|
Enable Not Saving of Dial-up Password |
verifies that Windows is configured to prevent a dial?up/VPN password from being saved between sessions |
7 |
Integrity |
Using saved password to access via dial-up/VPN |
|
Detection of Dead Gateways |
verifies Windows Server 2003 is configured to disable dead gateway detection |
3 |
Integrity |
Slow network performance |
|
ICMP Redirects |
verifies Windows Server 2003 is configured to disable ICMP redirects |
3 |
Integrity |
Redirect packets to unknown locations |
|
TCP Keep Alive Time |
verifies that Windows Server 2003 is configured to control how often TCP attempts to verify that an idle connection is still intact by sending keep-alive a packet |
3 |
Integrity |
Shorten the keep-alive packets |
|
Disable Media Auto play |
verifies Windows is configured to turn off the Autorun feature on all drives |
7 |
Integrity |
Autorun rootkit |
|
NetBIOS Name Release |
verifies Windows Server 2003 is configured to prevent release of its NetBIOS name when a name-release request is received |
5 |
Integrity |
Gain multiple computers names |
|
Router Discovery |
verifies Windows Server 2003 is configured to disable the Internet Router Discovery Protocol (IRDP) |
3 |
Integrity |
Use server to flood the network with discover packets |
|
Safe DLL Search Mode |
verifies that Windows Server 2003 is configured to search the %Systemroot% for the DLL before searching the current directory or the rest of the path |
7 |
Integrity |
Modify %Systemroot% |
|
Screen Saver Grace Period |
verifies that Windows is configured to have password protection take effect within a limited time frame when the screen saver becomes active |
7 |
Integrity |
Password not required after screensaver |
|
Syn Attack Protection Level |
verifies Windows Server 2003 is configured to protect against Syn attacks |
3 |
Integrity |
Attack server with Syn packets |
|
TCP Connection Responses |
verifies Windows Server 2003 is configured to control the maximum number of times that TCP retransmits a SYN before aborting the attempt |
4 |
Integrity |
Forces aborting before TCP retransmits |
|
TCP Data Retransmissions |
verifies Windows Server 2003 is configured to control the maximum number of times that TCP retransmits unacknowledged data segments before aborting the attempt |
4 |
Integrity |
Forces aborting before TCP retransmits unacknowledged data segments |
|
Event Log Warning |
verifies that Windows Server 2003 is configured to generate a warning when the Security Event Log has reached a defined threshold |
7 |
Integrity |
Modify event warning log |
|
Anonymous SID/Name Translation |
verifies Windows Server 2003 is configured to prevent users authenticated as anonymous users from performing SID/Name translation |
6 |
Integrity |
Connect to anonymous servers |
|
Restrict Anonymous Network Shares |
verifies that Windows Server 2003 is configured to prohibit anonymous logon users (also known as “null” session connections) from listing account names and enumerating share names |
7 |
Integrity |
Access anonymous network shares |
|
Storage of Credentials or .NET Passports |
verifies Windows Server 2003 is configured to prevent storage of authentication credentials or .NET passports |
7 |
Integrity |
Find stored credentials on hosts |
|
Everyone Permissions Apply to Anonymous Users |
verifies Windows Server 2003 is configured to prevent anonymous users from having the same rights and permissions as the built-in Everyone group |
7 |
Integrity |
Access other users with equal rights |
|
Anonymous Access to Named Pipes |
verifies Windows Server 2003 is configured to prevent anonymous access to unauthorized named pipes |
7 |
Integrity |
Using unauthorized pipes |
|
Remotely Accessible Registry Paths |
verifies Windows Server 2003 is configured to prevent access to unauthorized registry paths from a remote computer |
7 |
Integrity |
Modify registry paths |
|
Remotely Accessible Registry Paths and Sub-paths |
verifies Windows Server 2003 is configured to prevent access to unauthorized registry paths and sub-paths from a remote computer |
7 |
Integrity |
Remotely modify registry paths |
|
Anonymous Access to Named Pipes and Shares |
verifies Windows Server 2003 is configured to prevent anonymous access to Named Pipes and shares |
7 |
Integrity |
Modify Named pipes and shares |
|
Anonymous Access to Network Shares |
verifies Windows Server 2003 is configured to prevent anonymous access to unauthorized network shares |
7 |
Integrity |
Modify Network shares |
|
Sharing and Security Model for Local Accounts |
verifies Windows Server 2003 is configured to use the classic network-sharing security model |
7 |
Confidentiality |
Local users modify other local accounts |
|
LAN Manager Hash Value |
verifies Windows Server 2003 is configured to prevent the LAN Manager hash of the password from being stored in the SAM |
4 |
Confidentiality |
Copy hash to have password |
|
Force Logoff when Logon Hours Expire |
verifies Windows Server 2003 is configured to force users to log off when their allowed logon hours expire |
7 |
Integrity |
User accounts are used after work hours |
|
LanMan Authentication Level |
Verifies that Windows is configured to refuse LM authentication. This removes the use of LM challenge/response from the network, preventing many attacks |
4 |
Confidentiality |
Open to several attacks |
|
LDAP Client Signing |
verifies Windows Server 2003 is configured for the minimum required signing requirements for LDAP clients |
5 |
Integrity |
Multiple LDAP client signings |
|
Minimum Session Security for NTLM SSP-based (including secure RPC) Clients |
verifies Windows Server 2003 is configured to meet the requirements for securing RPC sessions |
7 |
Confidentiality |
Hijack RPC clients |
|
Minimum Session Security for NTLM SSP-based (including secure RPC) servers |
verifies Windows Server 2003 is configured to meet the requirements for securing RPC sessions |
7 |
Confidentiality |
Refuse clients sessions |
|
Recovery Console – Automatic Logon |
verifies that the Recovery Console option to allow automatic logon is disabled |
7 |
Integrity |
Automatically login to recovery console |
|
Recovery Console – Set Command |
verifies that the Recovery Console SET command is disabled |
7 |
Integrity |
Modify Recovery Console |
|
Display Shutdown Button |
verifies that Windows Server 2003 is configured to not display the “Shutdown” button in the logon dialog box |
7 |
Integrity |
Power off server |
|
Strong Key Protection |
verifies that the system is configured to prevent users from using private keys without a password |
7 |
Confidentiality |
Access to private key |
|
FIPS compliant Algorithms |
verifies that the system is configured to use algorithms that are FIPS compliant for encryption, hashing, and signing |
7 |
Integrity |
Easier to obtain encryption, hashing, and signing |
|
Objects Created by Members of the Administrators Group |
verifies that the system is configured to set the default owner to the object creator of objects created by the Administrator group |
7 |
Confidentiality |
Modify objects in Admin group |
|
Case Insensitivity for Non-Windows Subsystems |
verifies that the system is configured to require case insensitivity for non-Windows subsystems |
7 |
Confidentiality |
Passwords are not case sensitive, quicker to break |
|
Global System Object Permission Strength |
verifies that the strength of the default discretionary access control list (DACL) for objects is increased |
7 |
Confidentiality |
Modify discretionary access control list |
|
Optional Subsystems |
verifies that additional subsystems are not permitted to run on the system |
7 |
Integrity |
Run custom built subsystem |
|
Software Restriction Policies |
verifies that certificate rules are enforced for a user process that attempts to run software with an .exe file name extension |
7 |
Integrity |
automatically run .exe files with payloads |
|
Event Log Configuration |
verifies that Windows Server 2003 is configured to preserve event data, should the size of the logs reach their maximum |
7 |
Confidentiality |
Modify Event logs |
|
Event Log Sizes |
determines if the event logs have been set to the proper size |
7 |
Confidentiality |
Expand the log size |
|
Restrict Event Log Access Over Network |
verifies that Windows Server 2003 is configured to restrict anonymous network access to the event logs over null-session shares |
7 |
Integrity |
Expand event logs all across the network and it will grow in size |
|
Preserving Security Events |
determines if the retention method for preserving event logs has been configured correctly |
7 |
Integrity |
Modify security events |
|
Service Object Permissions |
verifies that the ACLs for disabled services meet minimum requirements |
7 |
Confidentiality |
Remove ACLs |
|
File and Directory Permissions and Auditing |
verifies that the access-control permissions applied to the file or directory objects conform to DISA standards |
7 |
Confidentiality |
Modify directory permissions |
|
System Files |
NSA has determined that the default ACL settings are adequate when the Security Option “Network access: Let everyone permissions apply to anonymous users” is set to “Disabled” and Power User Group Membership is restricted to no users |
7 |
Integrity |
Everyone has more permissions |
|
Event Logs |
event log files “AppEvent.Evt,” “SecEvent.Evt,” and “SysEvent.Evt” must be protected |
7 |
Confidentiality |
Modify event logs |
|
File and Directory Auditing |
verifies that the minimum auditing configuration is applied to all files and directories in conformance with DISA standards |
7 |
Confidentiality |
Edit Directory configuration |
|
Registry Key Permissions and Auditing |
verify that registry access-control permissions and auditing conform to DISA standards |
7 |
Confidentiality |
Edit Registry keys permissions |
|
Anonymous Access to the Registry |
verifies that the system is protected from anonymous access |
7 |
Integrity |
Unknown edit of registry keys |
|
Registry Key Auditing |
verifies the auditing configuration for all the registry keys contained under the “HKEY_LOCAL_MACHINESoftware” and “HKEY_LOCAL_MACHINESystem” hives |
7 |
Integrity |
Unknown edit of registry keys |
|
Printers – Disallow Installation of Printers Using Kernel-mode Drivers |
SNMP is being used, this check verifies that the system is configured to prevent the installation of kernel-mode print drivers |
7 |
Integrity |
Installation of kernel-mode print drives |
|
Group Policy – Registry Policy Processing |
verifies that the system is configured to insure that Group Policy settings overwrite any unauthorized security policy changes |
7 |
Integrity |
Modify Group policy to give permissions |
|
Group Policy – Turn Off Background Refresh of Group Policy |
verifies that the system is configured to insure that Group Policy settings are refreshed while a user is currently logged on |
7 |
Integrity |
Edit Group policy to give permissions |
|
Error Reporting – Report Errors |
verifies that the system is configured to prevent reporting of errors to Microsoft |
7 |
Integrity |
False Positive error reporting |
|
Logon – Always Wait for the Network at Computer Startup and Logon |
verifies that the system is configured to cause Windows to wait for complete network initialization before allowing the user to log on |
7 |
Integrity |
Can login without network |
|
Remote Assistance – Offer Remote Assistance |
verifies that the system is configured to prevent unsolicited offers of help to this computer |
7 |
Integrity |
Send false offers for remote access |
|
Remote Assistance – Solicited Remote Assistance |
verifies that the system is configured to prevent solicited remote assistance from this computer |
7 |
Confidentiality |
Send false offers for remote access |
|
Windows Time Service – Configure Windows NTP Client |
verifies that the system is configured to synchronize with a secure, authorized time source, and not the Microsoft time server |
7 |
Confidentiality |
Change system time |
|
IE – Disable Automatic Install of Internet Explorer Components |
verifies that the system is configured to prevent the automatic installation of components if it goes to a site that requires components that are not currently installed |
7 |
Integrity |
Install custom IE components with payload |
|
IE – Disable Periodic Check for Internet Explorer Software Updates |
that the system is configured to prevent periodically checking the Microsoft web sites to determine if there are updates to Internet Explorer available |
7 |
Integrity |
Updates can cause system to not function |
|
IE – Security Zones: Do Not Allow Users to Add/Delete Sites |
verifies that the system is configured to prevent users from adding sites to various security zones |
7 |
Integrity |
Add unwanted sites to security zones |
|
IE – Security Zones: Do Not Allow Users to Change Policies |
verifies that the system is configured to prevent users from adding sites to various security zones |
7 |
Confidentiality |
Edit unwanted sites to security zones |
|
IE – Security Zones: Use Only Machine Settings |
verifies that the system enforces consistent security zone settings for all users of the computer |
7 |
Integrity |
Modify security zone settings |
|
NetMeeting: Disable Remote Desktop Sharing |
verifies that Remote Desktop Sharing should be disabled |
7 |
Confidentiality |
View other users desktop |
|
Terminal Services – Limit Number of Connections |
verifies that the system is configured to limit the number of simultaneous connections to the terminal server |
7 |
Confidentiality |
Overload terminal server with requests |
|
Terminal Services – Limit Users to One Remote Session |
verifies that the system is configured to limit users to one remote session |
7 |
Confidentiality |
Overload terminal server with sessions |
|
Terminal Services – Remote Control Settings |
verifies that the system is configured to prevent Remote Control of Terminal Service sessions by another user |
7 |
Integrity |
Hijack remote terminal service session |
|
Terminal Services – Prevent Password Saving |
verifies that the system is configured to prevent Users from saving passwords |
7 |
Integrity |
Using saved passwords |
|
Terminal Services – Set Client Connection Encryption Level |
verifies that the system is configured to require the proper encryption level that is used for the client connection |
7 |
Confidentiality |
Break encryption from client |
|
Terminal Services – Secure Server |
verifies that the Terminal Server is configured to require secure remote procedure call (RPC) communication with clients |
7 |
Confidentiality |
Break encryption from server |
|
Terminal Services – Allow Reconnection from Original Client Only |
verifies that the system is configured to allow only the original client to resume a session |
7 |
Integrity |
Reconnection made by anonymous users |
|
Terminal Services – Set Time Limit for Idle Sessions |
verifies that the system is configured to disconnect idle sessions after no more than 15 minutes |
7 |
Integrity |
Hijack idle sessions |
|
Terminal Services – Set Time Limit for Disconnected Sessions |
verifies that the system is configured to end disconnected sessions after 1 minute |
7 |
Confidentiality |
Always disconnecting |
|
Terminal Services – Terminate Session When Time Limits are Reached |
verifies that the system is configured to forcefully disconnect clients if their terminal services time limit is exceeded |
7 |
Confidentiality |
Refuses terminal sessions |
|
Terminal Services – Do Not Delete Temp Folder upon Exit |
verifies that the system is configured to require the deletion of the temporary folders when the session is terminated |
7 |
Integrity |
Browse temp folder to information |
|
Terminal Services – Do Not Use Temp Folders per Session |
verifies that the system is configured to require per session temporary folders |
7 |
Integrity |
Explore temp folders |
|
Media Player – Disabling Media Player Automatic Updates |
verifies that the system is configured to prevent automatic updates by the Windows Media Player |
7 |
Confidentiality |
Updates can cause system to not function |
|
Windows Messenger – Do Not Allow Windows Messenger to be Run |
verifies that the system is configured to prevent the Windows Messenger client from being run |
7 |
Integrity |
Anonymous IMs for information | |
Windows Messenger – Do Not Automatically Start Windows Messenger Initially |
verifies that the system is configured to prevent the automatic launch of Windows Messenger at user logon |
7 |
Confidentiality |
Anonymous IMs for information |
|
Password Protected Screen Savers |
verifies that a password-protected screen saver is activated for users with a timeout value of 15 minutes or less |
7 |
Confidentiality |
Password not required after screensaver |
|
Media Player – Prevent Codec Download |
verifies that the system is configured to insure that all CODECs are installed by the System Administrator, and not automatically downloaded |
7 |
Confidentiality |
Codecs have hidden payloads |
|
Recycle Bin Configured to Delete Files |
verifies that Windows 2003 Servers have the Recycle Bin configured to delete files |
7 |
Integrity |
Review deleted files |
|
Disallow AutoPlay/AutoRun from Autorun.inf |
registry key will prevent the Autorun.inf from executing |
7 |
Integrity |
Auto run payload |
|
Passwords Requirement |
any accounts listed in the user report have a “No” in the “PswdRequired” column |
7 |
Confidentiality |
No security |
|
Passwords Expiration |
any accounts listed in the user report have a “No” in the “PswdExpires” column |
7 |
Confidentiality |
Password stays the same |
|
Application Account Passwords |
have a local policy to ensure that passwords for application/service accounts are at least 15 characters in length and meet complexity requirements for all passwords |
7 |
Integrity |
Weak passwords are easier to crack |
|
Dormant Accounts |
any enabled accounts have not been logged into within the past 35 days |
7 |
Integrity |
Open to login attacks |
|
Restricted Administrator Group Membership |
an account, without administrator duties, is a member of the Administrators group |
7 |
Confidentiality |
Admin rights |
|
HelpAssistant or Support_388945a0 Accounts Not Disabled |
the HelpAssistant or Support_388945a0 accounts have not been disabled |
7 |
Confidentiality |
Login using known usernames |
|
Users with Backup Operator Privileges |
verifies that any accounts with backup operator privileges have been documented and users with this privilege are assigned a unique account with membership in the “Backup Operators” group, separate from their standard user account |
7 |
Integrity |
Modify Backup Operations |
|
FTP (File Transfer Protocol) Server Configuration |
verifies that the FTP server is configured in accordance with DISA standards |
7 |
Integrity |
Access unsecure file transfer servers |
|
Prohibited FTP Logins Permitted |
Anonymous ftp will not be configured on systems that are inside the protected perimeter |
7 |
Confidentiality |
Anonymous access to file servers |
|
Access to System Drive Permitted |
attempt to access the root of the boot drive |
7 |
Integrity |
Modify root drive |
|
DCOM – Default Authorization Level |
verifies that system default Authorization Level is set to an appropriate level |
7 |
Confidentiality |
modify authorization not required for login |
|
DCOM – Object Registry Permissions |
verifies that a DCOM object doesn’t have access permissions that allow non-administrator users to change the security settings |
7 |
Confidentiality |
Modify security settings |
|
DCOM – RunAs Value |
verifies that DCOM calls are executed under the security context of the calling user |
7 |
Confidentiality |
Execute as admin users |
|
ASP.NET Common Runtime Host (.NET Framework) |
verifies that ASP.NET is not installed on a system |
7 |
Integrity |
install ASP.NET |
|
Weak Passwords (Domain Controllers) |
password strength checking scripts indicates that there are weak passwords on the system |
7 |
Integrity |
Weak passwords are easier to crack |
|
Security-related Software Patches |
verifies that security-related software patches are applied to the system on a timely basis |
7 |
Confidentiality |
Using known working exploits that are fixed with new software patches |
Works Cited
Aycock, J., & Barker, K. (2005). Viruses 101. ACM SIGCSE , 152-156.
Du, W., & Mathur, A. P. (1998). Vulnerability Testing of Software using Fault Injection. West Lafayette: Purdue Univeristy.
Jajodia, S., Noel, S., & O’Berry, B. (2005). Topological Analysis of Network Attack Vulnerability. In V. Kumar, J. Srivastava, & A. Lazarevic, Managing Cyber Threats: Issues, Approaches, and Challenges (pp. 247-266). New York: Springer US.
Jorgensen, A. A. (2003, March 1). Testing with Hostile Data Streams. ACM Software Engineering Notes , 28, pp. 1-6.
National Institute of Standards and Technology. (2009). WINDOWS SERVER 2003 SECURITY CHECKLIST. Arlington: Defense information Systems Agency.
Wales, E. (2003, July 1). Vulnerability Assessment Tools. Network Security , 15-17.
This group’s abstract starts off with a very good introduction to this lab. The beginning of the abstract ties this lab into the last lab and also shows the value of this lab. The second part of the abstract was not written very well. The second part of the abstract tries to explain what is involved in this lab. The last part of the first sentence and the last sentence of the abstract do not make any sense. The group could have reworded these sentences to give a better understanding of what was entailed in this lab. The group’s literature review was almost complete. The only thing that was missing in the literature review was how the articles relate to the current lab. The literature review does mention in the end of the review how one of the articles is similar to this course, but there is no other mention to how these articles tie into the current lab. Over all the literature review was done very well. It covered what each of the articles themes were and talked about a question if the article had one. The review also examined the methodologies, research, and errors in each of the articles. Also the review did a very nice job of tying each of the articles in with each other. The group’s methodology could have been expanded on. The methodology only gives the steps given in the lab and do not explain how they are going to go about accomplishing each step. The group expands on how they will use the literature reviews to aid in accomplishing this lab, but lacks in explaining any other step. The findings section of this group’s lab was also lacking. The first part of the findings talks about leaving out a part of the NIST document that they used, that would not pertain to their system. This section of the findings could have been placed in the methodology of this lab report. The group then gives a list of samples that they used to test their vulnerabilities on. The group left out a couple parts of the lab that should have been covered. The group did not mention any findings they saw when they put the table together, like any patterns in the OSI layers of the changes or in McCumber’s cube. They did not talk at all about the section of the lab that explores security patches applied to the examined operating system. The group did not even include that section in the table or in a table by itself. A lot more could have been included in the findings for this group. In the issues section the group talks about not being able to apply tools to the exploits found in the first part of the lab. The group seems to have missed the whole section on exploring security patches. The part of the lab that the group was having troubles with pertained to finding tools that could be used against the vulnerabilities that the security patches fixed. The group did mention in the abstract that they were going to examine this section but never came back to it. In the conclusion the group mentions that they are surprised in the findings after creating the table. They say they never thought of approaching examination of vulnerabilities in this way. The conclusion could have expanded on some of the patterns they found and how that could relate to how those patterns could be examined to perform a better penetration attack on a system.
The abstract reads more like an introduction to a literature review, not bad but would’ve fit better as an introductory paragraph to your literature review. The first and second paragraphs of the literature review are merely summaries of the assigned readings without any ties to the lab activities or each other for that matter. Interestingly enough, the word “lab” only appears once in the literature review. In the future, tying the literature at hand with the activities of the lab makes for a more interesting and relevant read for the reader. The summaries of the articles aren’t bad in themselves but could be taken a few steps further to tie everything together.
The methodologies section is too brief for the task at hand. Instead of restating the requirements and that you intend to accomplish them, give some detail on how you plan on accomplishing the tasks of the lab. How will you parse the NIST document for vulnerabilities? How will you classify them? How will you select the tools to test the vulnerabilities you discover and where will you select them from? All of these questions should be answered to make the process repeatable. A mention of how the vulnerabilities discovered were going to be entered into the table for the findings would have been helpful too, the output is hard to read.
There was some material in the findings section that could’ve been put in the methodologies section instead. Discussion on omitting the three configurations and the sampling of security checks would’ve added more detail to the methodologies section. The organization of table one for the findings was difficult to follow. Was this the order they were listed in the guide? Would it have been easier to order them by OSI layer? In the methodologies mention was made of considering large scale patching and how they affect the outcome of this lab’s exercises, the findings section doesn’t have a discussion on this, only a one line entry in the table mentioning that the patches be applied in a timely manner. Are there any vulnerabilities that arise from this process? Is patching as soon as possible always the best way to go? Some of the vulnerabilities discovered raise questions as well. One of the vulnerabilities or check list items from the document was to ensure that “Windows Server 2003, 128 bit version is installed.” Is this a typo? Is this a level of cryptography required?
The issues section is interesting. It appears that the authors didn’t agree with testing the vulnerabilities they discovered with tools since this lab was titled “exploits without tools” so they chose simply to ignore that requirement.
Team two’s submission this week is sort on analysis, misses some very important points, and has some content issues. The abstract is well written. The last sentence is unclear. Are the systems supposed to institute the recommendations in table form? Your meaning would have been clearer if you said, “fail to follow the recommendations,” or something similar.
The literature review summarizes the articles well but fails to relate them back to the lab. Critical analysis is also rather thin. What is Wales actually advocating? Does he really even discuss tools? You say his stance is weak, but you don’t tell us why. Citing a lack of research question or methods is a cop-out. These things are present, but implied. You say that among the good points he makes, he says that an automated tool report without a security professional is useless. Is this always true? Need someone be a dedicated professional in order to be an expert? What other good points does Wales make? What is the value in what Jajodia et al. do? Are there flaws in their methodology? The group appears to have glossed over the concept of resiliency in DU & Mather’s work. Why? Do you agree with their analysis of the relationship between the application and the environment? You are very critical of Jorgensen. Did you consider that perhaps what he demonstrates is merely proof of concept? Is it applicable to other file types and applications?
There are some problems in the methods section as well. If the literature review is standard practice, does it really need to be mentioned in the Methods section? It’s along the same lines as reporting that the lab was written and submitted via the guidelines. Oh wait you did that. Don’t. The majority of the steps listed are repeatable, however the group never explains how they will test the vulnerabilities, just that they will use the standard environment.
The team's findings section lacks detail and relies on fluff to increase the word count. We know you did a literature review. Stop referring to that fact in every section; it wasn't really all that noteworthy. Your findings section relies a great deal on the included table, which does indeed appear to answer the first four questions in the lab. I question placing physical security at layer 0, especially this far into the course. No discussion of patches or roll-ups is included. I didn't see anywhere that the results in the table were actually tested. Is that the reason for the lack of detail in the methods section on this point?
The conclusion follows the findings and validates the purpose of the lab in spite of the missing pieces. How did you solve the issue mentioned? It appears to be more of a complaint than an issue. Why do you think the lab is titled "exploits without tools" if we are asked to discuss tools?
The wording in the first paragraph leads the audience to believe that a security professional needs tools to be more productive and to somehow make more money because of them. The team then states that not only professionals have access to these tools; other people do as well, and can get them without paying for them. The question this raises in my mind is, why should professionals be paid a high salary when other people can use the tools in exactly the same way and don't need to be paid as much? In the abstract, all acronyms should be spelled out; under APA 5 format, acronyms should be spelled out the first time they appear in each new section. What constitutes a "typical" installation of Windows Server 2003? I would consider a patched and updated version to be a typical installation. I could see a Windows XP installation going unpatched, since it is used at home and people might be less likely to install the updates, but one would hope that servers are patched and updated. The abstract also contains some poor grammar: "…and using that against those that do what it says." I am not exactly sure what the last part of this sentence is supposed to say, but there are a lot of pronouns for so few words.
I would like to see page numbers for the references taken from the required readings so I can look them up myself and see how the team interpreted them. Including page numbers is also a requirement of APA 5 citations, and if you are taking a direct quote from the readings, it must be in quotation marks. "To that end" seems to be a favorite way to begin sentences in these lab reports; I have noticed it in the previous four lab reports as well. Titles of articles, books, and journals should be italicized. For everything that Wales talks about, it all seems to be according to other people. Did the team research all of these other items that Wales keeps referring to? The literature review seems to be a string of random thoughts; all of the questions were answered, but it made the review less cohesive. The methods section reads more like what an abstract should. It talks about what the team will be doing, not the steps taken to perform the lab. I am wondering where the list from step 5 is. Also, step 7 should have had the team create another table. The team seems to be missing about half the lab: there should be two tables and a list. What about patches? The one table the team did have was hard to read and was not ordered by layer of the OSI seven-layer model. I think the idea of step 7 was to see what changes were made to protect against the tools that were researched in previous labs.
This team’s literature review did a reasonable job of addressing the articles associated with the lab, including both a comparison of ideas and a critique of concepts contained within the writings. Additionally, the physical layout of the table was appealing to the eye, with concise table categories chosen for presentation. Finally, the length of the results table was appreciable, with good descriptions chosen for vulnerability listings.
It must be said that many problems exist with this group’s treatment of the exercise, however. Noticeably, the report began on a poor footing with an awkwardly composed abstract. The discussion of the “issues with tools” leaves one wondering what point was being made. Furthermore, the phrase “using that against those that do what it says” is at the very least a poor choice of wording; some might call it an abomination. The closing sentence was a fitting end to the section, as it too proved confusing. By all measures, it must be said that this part of the write-up could stand a great deal of correction.
While the literature review was adequate in some regards, it lacked any specific mention of how the articles under review applied to the exercise. Perhaps this was unsurprising, as there was little contained within this team's write-up to apply those concepts to. For instance, the 'Methods' section was very brief and contained no more than a rephrasing of the exercise instruction sheet. Perhaps even more disturbing, though this team spelled out those steps exactly, it appears to have left the last two steps uncompleted.
The 'Results' section was also flawed: at the very least, a substantial portion of it should have been located in 'Methods.' Some information of worth is presented, but much of the writing suffers from poor wording and redundant phrases. The 'Issues' section probably contains information that should have been included in the 'Methods' section, too. Amusingly, this team attempted to rationalize the last step of the exercise away based on a trivial semantic argument. Could "exploits WITHOUT tools" possibly mean that no reconnaissance tools were to be used to determine exploits (the converse of the previous week's exercise)? The argument presented is silly, even childish: how could 'testing' "be outside the scope of the lab" if it was listed as a specific step? In all honesty, I believe that you did yourself a disservice by resorting to this level of excuse making. Finally, the assertion that "by definition no tool is required to exploit… [a] lack of [server] physical insecurity [sic]" was rather ridiculous. It is assumed that 'security' was meant; even so, as illustrated by previous exercises, the physical domain necessarily requires physical tools: the implication is that such tools do exist, just not in the form of software.
Finally, some issues exist with the 'Results' chart itself. The ordering of layer classifications could have been better; we found it useful to create the tables in Microsoft Excel first, which allows arbitrary ordering on any table category as needed. Further, this team's table appears inconsistent in its classification of issues with respect to the OSI layers. For example, I find "LDAP Signing Requirements" listed at layer seven, and later "LDAP Client Signing" listed at layer five. Another obvious mistake: "Syn Attack Protection Level," which refers to TCP SYN attacks, was listed at OSI layer three, most certainly out of place. There are a number of other classifications I disagree with, mostly with regard to session management and Server Message Block, but it must be admitted that this is largely in the realm of opinion, as no consistent classification was found in consulting various sources.
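Since table ordering keeps coming up, a minimal sketch of an Excel-free alternative is worth showing. It assumes the findings table has been exported to a hypothetical findings.csv with "Layer" and "Setting" columns (both names are my assumptions); sorting on the layer column puts related entries, such as the two LDAP signing checks, side by side where inconsistencies are easy to spot.

```python
import csv
from itertools import groupby

# Hypothetical export of the findings table; the column names are assumptions.
with open("findings.csv", newline="") as fh:
    rows = sorted(csv.DictReader(fh), key=lambda r: int(r["Layer"]))

# Group by OSI layer so misclassifications stand out during review.
for layer, group in groupby(rows, key=lambda r: r["Layer"]):
    print(f"Layer {layer}")
    for row in group:
        print(f"  {row['Setting']}")
```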
Team 2's abstract is well written and explains what they intend to do in lab 5. In addition, they do a nice job of relating back to previous labs' literature and tying it into this lab, though perhaps that part of the abstract should have been included in their introduction. The literature review appears to be just a summary of the assigned readings and does not tie back to the actual lab. The summaries themselves are good but needed to correlate back to the lab.
The literature review gives a good summary of what the authors were trying to convey. All of the questions were answered; however, the overall literature review was not very cohesive.
The methodologies section is really short considering all that needed to be accomplished in this lab. Also, the methods section reads more like what an abstract should.
Detail how your team intends to meet the requirements instead of just restating the requirements. There is material in the findings section that could have been put in the methodologies section instead.
In the abstract section, team two stated, "Based on all of the literature provided in previous labs, performing a penetration test is all about the tools used. Tools represent the basis for making the security professional's life simpler, and more productive and therefore higher paying." I have to somewhat disagree, for the tester must know which tools should and should not be used depending on the situation, and the tools are of little use if the operator cannot get them to run at all or to work properly. I noticed that the team referred to themselves as "we," which is a violation of the APA 5 guidelines; consider changing this to further improve the scholarship of the paper.
In the literature review section, I had to disagree with the statement "The tone of this idea as presented by Wales seems to be that in house IT staff are generally not trained well enough to perform an in depth penetration test and audit." Wales seemed to think that a hybrid of in-house and external penetration testing was best, since external testers would not be as familiar with all aspects of the network as the in-house IT personnel are.
In the methodology section, the statement "Our strategy for lab five is to follow the requirements of the lab design document and course syllabus to present a complete lab report" was moot, for this was strived for by all groups. Team two's methodology section seemed to be missing how the group would identify and list exploits or threats, although the findings section seemed to address this omission. Group two also seemed to omit coverage of patches.
Team two's findings section gave a thorough description of the group's findings. I was somewhat unclear about the statement "If that setting is disabled then any system other than domain connected Windows 2000 or greater clients can connect to an SMB share. This includes Apple machines, and Linux machines, since backtrack runs on Linux this seems like an important setting to have enabled." If the server could be accessed by BackTrack, it could be used by penetration testers to identify weaknesses, but it could also be exploited by malicious individuals, who would use BackTrack to find vulnerabilities that could and would be exploited. Team two summarized the content of the NIST document by stating, "After reviewing a mall [sic] sampling of security checks in table one, it became overly obvious that most of these checks while over looked by most, are generally common sense to increase the security of information assets. This is apparent because the first check is server physical security. This is an obvious enhancement to system security." I was also surprised that the group did not describe any patterns that emerged from their table.
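For anyone who wants to verify the setting in question on the lab server, my understanding is that the "Digitally sign communications (always)" policy is backed by the RequireSecuritySignature value under the LanmanServer parameters key; the short Python sketch below checks it. Treat the key path as an assumption to be confirmed against the NIST checklist rather than as the checklist's own wording.

```python
import winreg  # standard library, available on Windows only

# Assumed mapping of the policy "Microsoft network server: Digitally sign
# communications (always)" to its registry value; confirm against the checklist.
KEY_PATH = r"SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    value, _type = winreg.QueryValueEx(key, "RequireSecuritySignature")

print("SMB signing required" if value == 1 else "SMB signing NOT required")
```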
Team two's issues section contained but one issue: they did not see the significance of step seven, which stated, "Using your previous work take the list of issues, and find tools that will be associated with each issue. Create a table or matrix." Their rationale was that the laboratory was not to involve tools, so the step appeared to be contradictory.
In the conclusion section, I was not sure how team two came up with the conclusion that “The actions recommended by NIST are very rigorous, but will harden a Windows Server 2003 machine against most attacks, especially ones that use “script kiddy” level tools.”
Team 2 begins by stating the purpose of this laboratory assignment: to investigate vulnerabilities without tools. They further state that value is added to the use of the tools by the professionals who analyze the results, and that they are going to focus on the NIST documentation intended to secure Windows Server 2003.
Team 2 proceeds with a review of the literature assigned for this week. They begin with a discussion of the need for vulnerability assessments and then discuss Vulnerability Assessment Tools (Wales, 2003). I agree with their assessment that the value of the article is questionable. They continue their literature review by discussing Topological Analysis of Network Attack Vulnerability (Jajodia, Noel & O'Berry, 2005). They state that this article discusses the value added to a standard NESSUS scan by analyzing the data to determine the possibility of indirect attacks. They further state that vulnerability assessment needs to go beyond an IT audit and include a complete risk assessment covering software applications, hardware, and network infrastructure (Wales, 2003). They proceed to discuss Vulnerability Testing of Software System Using Fault Injection (Du & Mathur, 1998) and Testing with Hostile Data Streams (Jorgensen, 2003). Team 2 questions the ability to use steganography with files other than GIF or JPEG. (They use the term stenography; however, I am assuming that they meant steganography.) According to http://www.answers.com/topic/steganography, "Although steganographic messages can be hidden in any kind of digital files, image files, because they contain so much data to begin with, are usually used for digital steganography". Jorgensen refers to steganography as "In the context of data transmitted over the Internet, data included within that transmission that serves a purpose other than the original purpose of the data transmission is steganographic data." He uses this definition to describe any transmission over the Internet that contains hostile code within an otherwise benign file. By these definitions, the file type containing the hostile code is irrelevant. They then return to their discussion of Vulnerability Assessment Tools (Wales, 2003) and its statements that penetration testing is best left to professionals. They disagree with this position and believe that training is the key, citing Viruses 101 (Aycock & Barker, 2005) to support their discussion about becoming educated.
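Jorgensen's broad definition is easy to demonstrate without touching image pixels at all: a GIF remains perfectly valid if extra data is appended after its 0x3B trailer byte, which is exactly the sort of benign-looking carrier he describes. The following is only a minimal sketch; the file names and the "HIDDEN" marker are hypothetical choices of mine, not anything from the readings.

```python
# Append a hidden payload after the GIF trailer byte (0x3B); most viewers still
# render the image normally, but the extra data rides along inside the file.
CARRIER, PAYLOAD, OUTPUT = "cover.gif", "payload.bin", "stego.gif"  # hypothetical names

with open(CARRIER, "rb") as fh:
    image = fh.read()
with open(PAYLOAD, "rb") as fh:
    payload = fh.read()

assert image.endswith(b"\x3b"), "not a well-formed GIF (missing trailer byte)"

with open(OUTPUT, "wb") as fh:
    fh.write(image + b"HIDDEN" + payload)  # marker makes later extraction trivial

# Extraction is just a split on the marker.
with open(OUTPUT, "rb") as fh:
    recovered = fh.read().split(b"HIDDEN", 1)[1]
assert recovered == payload
```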
In the next section, Team 2 discusses the methods used in this lab. They describe six steps used toward completion of the lab. The first step was to select an operating system security guide and have the guide approved by the instructor; they chose the Windows 2003 Checklist V6.1.12 from NIST. Their second step was to identify what each change was designed to accomplish, place the change in its proper location on the OSI 7-layer model and McCumber Cube, consider the role that large-scale security patches play, and then test the results against the live lab system.
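Neither the team nor the lab sheet spells out what such a classification record looks like, so the following is only my own sketch of how each checklist item could be captured as an OSI layer plus a McCumber Cube coordinate; the example entry and its classification are assumptions, not Team 2's.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    """One checklist item mapped to the OSI model and the McCumber Cube."""
    setting: str
    purpose: str
    osi_layer: int    # 1-7 (physical items are sometimes parked at "layer 0")
    goal: str         # confidentiality | integrity | availability
    info_state: str   # storage | transmission | processing
    safeguard: str    # policy | education | technology

# Example entry; the classification shown here is illustrative only.
example = Finding(
    setting="Require SMB signing",
    purpose="Prevent unsigned (potentially spoofed) SMB sessions",
    osi_layer=5,
    goal="integrity",
    info_state="transmission",
    safeguard="technology",
)
print(example)
```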
In the findings section, Team 2 further defines methods used in the lab. One example is the security configurations that they found in their NIST document that were not used in this lab because the configurations did not apply to their test environment. Perhaps this information should have been placed in the methods section rather than the findings section. They proceed to explain a few of the more prominent security recommendations. They further state that they found that most of the security recommendations fall into the realm of common sense.
Team 2 begins with their abstract and explains what they will be doing within this lab. They also note that they will be using the National Institute of Standards and Technology documentation for Windows Server 2003. They then go on to the literature review and open with the statement that performing a vulnerability assessment is no longer just a good idea but has become a way of life. Why does the group think it was not always viewed this way? The statement could be made even stronger by giving information to back it up. They then continue on to discuss risk assessment and vulnerability assessment and how they should be combined. Does the group think this would be a good standard to be issued by an organization such as IEEE or NIST? The main thing that could have improved the literature review would have been breaking it down by article; there were good points that could have been drawn from each and supported by the articles themselves. Overall it was a good section, and the team then moved on to the methodologies. Here the team describes what is going to occur within the lab environment and that two tables will be made. They did not really explain how they were going to accomplish the tasks; they more or less listed the tasks and stated that they were going to use their NIST documentation within the section. They then go on to state their findings from reading the document and give the definition of a vulnerability. Within the first couple of paragraphs I found a spelling error: one word reads "mall" instead of "small." They did a good job of creating their table, but when looking at the findings I was expecting more than just a table. The group then discusses the issues they had with the lab, stating that they had an issue with creating the table required by the lab. Their conclusion states that securing the server using the document will successfully stop a "script kiddie." That is a good point, but it does not really fit well within a conclusion. Overall there were some issues with the lab, but the table and the literature review were the strong points of the team's submission this week.
I think that group 2's write-up for lab 5 was very good. The abstract was good and accurately described the laboratory. The literature review was good and adequately reviewed the material, and group 2 answered all of the required questions for each reading. All of the citations in the literature review were done correctly. For this lab, the group answered all of the required questions and provided a good amount of detail about the NIST document that they used. The group also included a very extensive table that lists many vulnerabilities found in the document and how they relate to the McCumber Cube. The group raised an interesting point in their issues and problems section, where they chose not to include security tools in their list because the purpose of this lab is exploits without tools. Finally, the conclusion was written well and accurately sums up the laboratory.