Abstract
In this lab we review the pros and cons of penetration testing, the methods and tools used in penetration testing, and the creation of lab environments for performing penetration testing. One method of penetration testing is a technological, tool-based approach that attacks points in a system corresponding to the layers of the OSI seven-layer model. In addition to the original seven layers, we examine two additional layers: people and kinetics. We create a matrix that gives examples of tools that could be used for attacks, the OSI layer each tool corresponds to, and the McCumber Cube coordinates that the tool would impact.
Literature Review
The primary focus of the literature in Lab 1 is the creation of various laboratory environments to support penetration testing and an attack-based security strategy. Some of the papers also focus on the benefits of using an attack-based strategy versus a defensive strategy. These two topics are certainly connected and make a good starting point for our TECH581W class, as acceptance of an attack-based strategy for security training is assumed.
The first paper in this set to talk about the benefits of using ethical hacking services and penetration testing is (Gula, 1999), which, though it is a vendor whitepaper, makes many interesting points about items that are often overlooked in penetration testing. The publisher, Enterasys, does business in the field of penetration testing, so that needs to be taken into consideration when reviewing the contents of the paper. The author does not do as good a job of selling penetration testing as he does of pointing out what is often missed in it, presumably to show that his company's service is superior. No quantitative research is given to show that the listed items are often missed by penetration testers, simply that they "normally go untested," in the author's view (Gula, 1999, p. 6). The history of penetration testing, along with a good argument for its necessity based on the amount of risk an organization is willing to tolerate from electronic intruders, is discussed in (Coffin, 2003). The author shows a very simple example of what a penetration test may cost versus downtime for an electronic commerce site. Of special note is the author's inclusion of the limitations of penetration testing and ethical hacking: just because a site is not compromised by the testers does not necessarily mean that it is secure, only that the particular tester did not find any vulnerabilities at that point in time with their specific skill set. In (Arce & McGraw, 2004), the authors discuss the various aspects of attack-based penetration testing and acknowledge that "the majority opinion is that the only way to properly defend a system against attack is to understand what attacks look like as deeply and realistically as possible" (p. 17). The authors acknowledge the prevalence of the buffer overflow attack and discuss newer, more advanced methods of buffer overflow attack. Also important to a red team in a penetration test is documentation of the methods employed and their results. In (Ray, Vemuri, & Kantubhukta, 2005) the authors describe a method of using XML to make their attacks reusable for other engagements. This not only creates a standardized form of documenting what was done but also allows easier sharing of the information with other testers in the organization.
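To illustrate the kind of reusable XML attack record the authors describe, a minimal sketch in Python follows. The element names (engagement, attack, target, tool, result) and the sample values are our own invention for illustration, not the schema from the paper.

# Hypothetical sketch of recording a penetration-test step as reusable XML.
# The element names are illustrative only, not the schema used by
# Ray, Vemuri, & Kantubhukta (2005).
import xml.etree.ElementTree as ET

def record_attack(name, target, tool, outcome):
    attack = ET.Element("attack", {"name": name})
    ET.SubElement(attack, "target").text = target
    ET.SubElement(attack, "tool").text = tool
    ET.SubElement(attack, "result").text = outcome
    return attack

root = ET.Element("engagement")
root.append(record_attack("SMTP user enumeration", "mail.example.com",
                          "SMTP-Vrfy", "valid accounts: 3"))
print(ET.tostring(root, encoding="unicode"))

Because such a record is plain XML, it could be shared between testers or transformed into a report without rewriting the attack notes.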
The one paper in this first group of readings that detailed weaknesses in penetration analysis was (Du & Mathur, 1998). The authors suggest that penetration testing requires the tester to know about flaws that might already exist in the system, which can significantly affect the validity of the test (p. 2); this is certainly something that cannot be overlooked. As mentioned in (Coffin, 2003), an unsuccessful penetration test does not mean that the system is completely secure. Du & Mathur (1998) introduce the idea of fault injection into systems. This method of testing attempts to cause the system to fail by injecting faults and bad data. The authors present a model for identifying areas that would be good candidates for testing and a methodology for analyzing results. Fault injection complemented by penetration testing would yield optimal results, but would require separate teams of experts for each part.
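As a rough sketch of the fault-injection idea (our own toy example, not the model from the paper), the snippet below corrupts one byte of otherwise valid input on each trial and records which injected faults cause a toy parser to fail:

# Minimal fault-injection sketch (illustrative only, not Du & Mathur's model):
# perturb valid input and record which injected faults crash the component.
import random

def component_under_test(record: str) -> int:
    # A toy "system": parses "name:age" and returns the age.
    name, age = record.split(":")
    return int(age)

def inject_faults(valid: str, trials: int = 100):
    failures = []
    for _ in range(trials):
        chars = list(valid)
        i = random.randrange(len(chars))
        chars[i] = chr(random.randrange(256))   # corrupt one byte
        fuzzed = "".join(chars)
        try:
            component_under_test(fuzzed)
        except Exception as exc:
            failures.append((fuzzed, type(exc).__name__))
    return failures

print(len(inject_faults("alice:30")), "of 100 injected faults caused a failure")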
The bulk of the readings in Lab 1 discuss the creation of lab environments and curricula to facilitate training security practitioners in the field of penetration testing.
(Holland-Minkley, 2006; Micco & Rossman, 2002; Mink & Freiling, 2006) all discuss teaching attack-based security classes and the necessity of having a lab for testing. While (Micco & Rossman, 2002) was primarily concerned with detailing issues that may arise when running a cyberwarfare class, the other two papers were concerned with the "success" of the class, which they based on empirical evidence gathered from students. Another topic raised in these papers regarding the curricula was getting permission from the institutions to run a class that could train potential attackers of their own networks. One important thing stressed in the classes was ethics and the responsibility surrounding the knowledge gained through the class. The basis of (Mink & Freiling, 2006) is whether students should learn about attacking systems, what can be learned from it, and whether defense is more important than attack. The authors also created a method to measure the results of this type of learning, but it strays from what the title of the paper suggests: given the question "is attack better than defense," one would expect the method to measure the benefits and disadvantages of attacking versus defending. As with security and usability it would not be possible to have one without the other; it all depends on the circumstances the professional is in, where the importance of one may take over the other.
Due to the nature of penetration testing, many of the tools used can be harmful to the normal operation of systems. Because of that, when teaching ethical hacking classes, lab environments need to be designed so that they do not impact the rest of the organization's systems or the Internet at large. The course instructors in (Micco & Rossman, 2002) designed a lab that was cut off completely from the rest of the school's network and had an Internet connection only through a modem, under a special agreement with the Internet service provider. In the DETER testbed, the organizers designed a system that was accessible remotely to multiple users and could handle running simultaneous, isolated tests (Benzel et al., 2006, p. 2). Another method of providing an isolated environment for the purpose of penetration testing is virtualization. In (Heien, Massengale, & Wu, 2008, p. 76) the authors use Qemu to create a virtualized environment with multiple clients and subnets, all without the need for any additional hardware. VMware was evaluated in (Micco & Rossman, 2002, p. 24), but the authors thought it would be of little benefit. They chose Linux because it was free, NT being "much more expensive," creating a homogeneous testbed which at the time probably represented only a very small fraction of the operating system market. The authors do concede that a heterogeneous network would have been much more realistic. Our lab environment for this class will consist of an isolated network between hosts running in VMware. This will allow us to maximize department resources and run multiple virtual hosts simultaneously without interrupting the other teams.
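As a point of reference, an isolated two-guest lab in the spirit of the Qemu environment described by Heien, Massengale, and Wu could be wired together with QEMU's socket networking, which connects the guests to each other but to neither the host's LAN nor the Internet. The sketch below is illustrative only; the disk image names are hypothetical and the memory sizes arbitrary.

# Hypothetical sketch: two QEMU guests sharing a private socket network with
# no path to the host LAN or the Internet. Image file names are made up.
import subprocess

attacker = subprocess.Popen([
    "qemu-system-x86_64", "-m", "512", "-hda", "attacker.img",
    "-netdev", "socket,id=n0,listen=:4444",            # first guest listens...
    "-device", "e1000,netdev=n0",
])
victim = subprocess.Popen([
    "qemu-system-x86_64", "-m", "512", "-hda", "victim.img",
    "-netdev", "socket,id=n0,connect=127.0.0.1:4444",  # ...second guest connects
    "-device", "e1000,netdev=n0",
])
attacker.wait()
victim.wait()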
Lab Exercise
In addition to creating our lab environment, we were also tasked with creating an attack matrix based on a nine-layer OSI model combined with the McCumber Cube (McCumber, 1991). By examining the layers of the OSI model we are able to identify attack tools that could exploit a system at each of those levels. Adding the McCumber Cube coordinates gives additional visibility into which areas an organization may need to focus its efforts on. We examined tool suites such as Backtrack, combined with Internet search engines and our personal knowledge, to compile the list below; a short sketch of how such a matrix could be represented in code follows the table.
OSI Layer | Tools | McCumber Cube Coordinate
People/8 | Ancestry.com, Careerbuilder, Dropbox, Dice.com, Facebook, Monster.com, Myspace, Twitter, Spy | Confidentiality, Storage, People
People/8 | Saboteur | Integrity, Storage, People
Application/7 | Absinthe, Checkpwd, Cisco Auditing Tool, Cisco Global Exploiter, Dig, Dnstracer, Dnsmap, Finger Google, Fuzzer, GFI LanGuard, Hydra, JBroFuzz, John the Ripper, Lodowep, Maltego, Medusa, Metagoofil, Metasploit, Metoscan, Mibble MIB Browser, Mistress, Nessus, Nikto, Nmbscan, Oracle Auditing Tools, PBNJ, Peach, PStoreView, RevHosts, RPCDump, Sidguess, SMTP-Vrfy, SQLbrute, Sqldict, Subdomainer, VNC_bypauth, whoami, WyD | Confidentiality, Storage, Technology
Application/7 | Achilles, Dmitry, Halberd, Httprint, Mezcal, Relay Scanner, TFTP-Brute, VNCrack | Confidentiality, Transmission, Technology
Application/7 | XSpy | Confidentiality, Processing, Technology
Application/7 | Sqlupload, Tini, Zodiac | Integrity, Storage, Technology
Application/7 | Pirana, Wapiti | Integrity, Transmission, Technology
Application/7 | Barrier | Integrity, Processing, Technology
Application/7 | Dnswalk | Availability, Storage, Technology
Application/7 | Bed, Cisco Torch | Availability, Transmission, Technology
Presentation/6 | OpenSSL-Scanner | Confidentiality, Storage, Technology
Presentation/6 | Squirtle | Integrity, Processing, Technology
Session/5 | Fierce, Mbenum, PcapSipDump, PcapToSip_RTP, SIPcrack, SIPDump, SIPp, SIPSak | Confidentiality, Transmission, Technology
Session/5 | FPort | Integrity, Processing, Technology
Session/5 | CIRT Fuzzer, ICMPTX, Juggernaut, NSTX | Availability, Transmission, Technology
Network/3 | Afrag, Airbase-ng, Aircrack-ng, Airdecap-ng, Airodump-ng, Airmon-ng, Aireplay-ng, Airpwn, airoscript, AirSnarf, Angry IP Scanner, ASLeap, Autonomous System Scanner, Autoscan, Cheops-ng, Ettercap, Fping, IKE-Scan, IKEProbe, itrace, Mailsnarf, Netdiscover, Netenum, Netmask, Nmap, PHoss, Ping, Protos, PSK-Crack, Psyche, ScanLine, Wireshark | Confidentiality, Transmission, Technology
Network/3 | IRDP Responder | Integrity, Transmission, Technology
Data Link/2 | Bluesnarfer, BTcrack, BTscanner, Redfang | Confidentiality, Transmission, Technology
Data Link/2 | Bluebugger, Blueprint, Bluesmash, Cain & Abel, Carwhisperer, File2Cable, Gobbler, HCIDump, MacChanger, MDK3, Minicom, ObexFTP, SMAC, Smap, Ussp-Push, Wellenreiter, Wicrawl, WifiTap, WifiZoo | Integrity, Transmission, Technology
Physical/1 | Fiber tap, Hardware Keylogger, “Pringles can” antenna, TEMPEST, Wiretapping | Confidentiality, Transmission, Technology
Kinetic/0 | Arson, Nuclear Bomb, Ship Anchor, Shovel | Availability, Transmission, Technology
Kinetic/0 | EMP | Availability, Processing, Technology
Kinetic/0 | Dumpster Diving | Confidentiality, Storage, Process
Kinetic/0 | Lock Pick | Confidentiality, Storage, Technology
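The matrix also lends itself to a simple programmatic representation. The sketch below is our own illustration, not part of the lab deliverable, and includes only a few rows from the table above to show how the tool-to-layer-to-coordinate mapping could be stored and queried:

# Minimal sketch of the attack matrix as a queryable structure.
# Only a few rows from the table above are included for brevity.
from collections import namedtuple

Entry = namedtuple("Entry", "layer tool coordinate")

MATRIX = [
    Entry(8, "Facebook",   ("Confidentiality", "Storage", "People")),
    Entry(7, "Metasploit", ("Confidentiality", "Storage", "Technology")),
    Entry(3, "Nmap",       ("Confidentiality", "Transmission", "Technology")),
    Entry(2, "MacChanger", ("Integrity", "Transmission", "Technology")),
    Entry(0, "EMP",        ("Availability", "Processing", "Technology")),
]

def tools_at_layer(layer):
    return [e.tool for e in MATRIX if e.layer == layer]

def tools_affecting(goal):
    return [e.tool for e in MATRIX if e.coordinate[0] == goal]

print(tools_at_layer(3))            # ['Nmap']
print(tools_affecting("Integrity")) # ['MacChanger']

Storing the matrix this way would make it straightforward to answer questions such as which tools affect integrity, or which tools apply at a given layer, without rescanning the full table.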
Findings
Since the original seven-layer OSI model exists inside a computer system, the majority of the tools identified fall in the technology area of the McCumber Cube rather than people or process. Any interaction which requires cognition and action on the part of the user would move into the people or processing area of the McCumber cube. Based on this, one might see a substantial bias towards a tool-based, technological approach to penetration testing. While that is certainly an important part of the testing procedure, a majority of the areas to attack are outside of the technology realm and, in many cases, are substantially easier to attack. If this matrix were presented to a customer, the testers should note that it is only an aid in seeing where these technological tools affect the organization, not a breakdown of the threats and vulnerabilities the organization may be exposed to.
Issues
Many of the tools found in the Backtrack list could fall into multiple categories, so we took the approach of placing each tool where we felt it was best suited for impacting confidentiality, integrity, or availability; essentially, where we felt the tool would do the most damage. One of the most difficult placements was Wireshark, since this tool can operate at layers 1 through 7. We felt that by placing it at layer 3 we would be able to monitor the vast majority of network traffic, since most is IP based, and use that common factor for working up the stack when necessary. Another issue encountered was a misunderstanding of the number of tools to be listed in the threat matrix. This was noted by the instructor before submission and required a rebuild of the original table.
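To illustrate the layer 3 vantage point we chose for Wireshark, the sketch below captures a handful of IP packets with Scapy rather than Wireshark itself (Wireshark dissects layers 1 through 7; this only demonstrates the IP-level view). Packet capture requires root privileges, and the default capture interface is platform-dependent.

# Minimal sketch: capture a handful of IP packets at layer 3 with Scapy.
# Requires root/administrator privileges to open the capture interface.
from scapy.all import sniff

def show(pkt):
    print(pkt.summary())  # one-line summary, e.g. "Ether / IP / TCP ..."

# The BPF filter limits the capture to IP traffic, the common factor at layer 3.
sniff(filter="ip", prn=show, count=5)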
Conclusions
This lab was a serious mental exercise that stretched our thinking patterns about how we view tools. By placing such a wide variety of tools into a matrix, we saw patterns develop at different layers of the OSI model. We saw a heavy focus on tools affecting confidentiality and data storage in the application layer, while tools in layers 2 and 3 affected the confidentiality and integrity of data in transmission. These findings would be useful when developing a plan for penetration testing, by showing where to concentrate efforts and which tools to use when attacking the various layers of an organization's security.
Works Cited
Arce, I., & McGraw, G. (2004). Guest Editors’ Introduction: Why Attacking Systems Is a Good Idea. Security & Privacy, IEEE, 2(4), 17-19.
Benzel, T., Braden, R., Kim, D., Neuman, C., Joseph, A., Sklower, K., et al. (2006). Experience with DETER: A Testbed for Security Research. Paper presented at the Proceedings of the 2nd IEEE Conference on Testbeds and Research Infrastructures for Development of Networks and Communities.
Coffin, B. (2003). It Takes a Thief: Ethical Hackers Test Your Defenses. Risk Management Magazine, 50, 10-14.
Du, W., & Mathur, A. (1998). Vulnerability Testing of Software System Using Fault Injection. Purdue University.
Gula, R. (1999). Broadening the Scope of Penetration-Testing Techniques: Enterasys Networks.
Heien, C., Massengale, R., & Wu, N. (2008). Building a Network Testbed for Internet Security Research. Paper presented at the Sixth Annual Consortium for Computing Sciences in Colleges Mid-South Conference.
Holland-Minkley, A. M. (2006). Cyberattacks: A Lab-based Introduction to Computer Security. Paper presented at the Proceedings of the 7th Conference on Information Technology Education.
McCumber, J. R. (1991, October). Information Systems Security: A Comprehensive Model. Paper presented at the 14th National Computer Security Conference.
Micco, M., & Rossman, H. (2002). Building a Cyberwar Lab: Lessons Learned: Teaching Cybersecurity Principles to Undergraduates. Paper presented at the Proceedings of the 33rd SIGCSE Technical Symposium on Computer Science Education.
Mink, M., & Freiling, F. C. (2006). Is Attack Better than Defense?: Teaching Information Security the Right Way. Paper presented at the Proceedings of the 3rd Annual Conference on Information Security Curriculum Development.
Ray, H. T., Vemuri, R., & Kantubhukta, H. R. (2005). Toward an Automated Attack Model for Red Teams. Security & Privacy, IEEE, 3(4), 18-25.
Links
Layer 8
Ancestry.com – http://www.ancestry.com
Careerbuilder.com – http://www.careerbuilder.com/
Dice.com – http://www.dice.com
Dropbox – http://www.getdropbox.com
Facebook – http://facebook.com
Myspace – http://www.myspace.com/
Monster – http://www.monster.com/
Twitter – http://twitter.com
Layer 7
Absinthe – http://www.0x90.org/releases/absinthe/docs/basicusage.php
Achilles – http://www.mavensecurity.com/achilles
Barrier – http://www.erratasec.com
Bed – http://snake-basket.de/bed.html
Checkpwd – http://www.red-database-security.com/software/checkpwd.html
Cisco Auditing Tool – http://www.infosecpro.com/penetrationtest/p73.htm
Cisco Global Exploiter – http://www.vulnerabilityassessment.co.uk/cge.htm
Cisco Torch – http://www.arhont.com/ViewPage7422.html?siteNodeId=3&languageId=1&contentId=-1
Dig – http://linux.about.com/od/commands/l/blcmdl1_dig.htm
Dmitry – http://www.mor-pah.net/
Dnstracer – http://www.mavetju.org/unix/dnstracer.php
Dnswalk – http://sourceforge.net/projects/dnswalk/
Dnsmap – http://unknown.pentester.googlepages.com/
Finger Google – http://unknown.pentester.googlepages.com/
Fuzzer – http://www.securiteam.com/tools/5TP012AHFU.html
GFI LanGuard – http://www.gfi.com/lannetscan/
Halberd – http://halberd.superadditive.com/
Httprint – http://net-square.com/httprint/
Hydra – http://www.thc.org/
JBroFuzz – http://www.owasp.org/index.php/Category:OWASP_JBroFuzz
John the Ripper – http://www.openwall.com/john/
Lodowep – http://www.cqure.net/wp/?page_id=17
Maltego – http://www.paterva.com/maltego/
Medusa – http://www.darknet.org.uk/2006/05/medusa-password-cracker-version-11-now-available-for-download/
Metagoofil – http://www.edge-security.com/metagoofil.php
Metasploit – http://www.metasploit.com/
Metoscan – http://www.securiteam.com/tools/5CP0O20IAK.html
Mezcal – http://0x90.org/releases/mezcal/
Mibble MIB Browser – http://www.mibble.org/
Mistress – http://packetstormsecurity.org/fuzzer/mistress.rar
Nessus – http://www.nessus.org
Nikto – http://www.cirt.net/code/nikto.shtml
Nmbscan – http://nmbscan.gbarbier.org/
Oracle Auditing Tools – http://www.cqure.net/wp/?page_id=2
PBNJ – http://pbnj.sf.net/
Peach – http://peachfuzz.sourceforge.net/
Pirana – http://www.guay-leroux.com/projects/SMTP%20content%20filters.pdf
PStoreView – http://www.ntsecurity.nu/toolbox/pstoreview/
Relay Scanner – http://www.cirt.dk/
RevHosts – http://www.revhosts.net/
RPC Dump – http://www.cultdeadcow.com/tools/rpcdump.php
Sidguess – http://www.red-database-security.com/whitepaper/oracle_guess_sid.html
SMTP-Vrfy – http://jeremy.kister.net/code/perl/vrfy.pl
SQLbrute – http://www.justinclarke.com/archives/2006/03/sqlbrute.html
sqldict – http://www.vulnerabilityassessment.co.uk/sqlat.htm
sqlupload – http://www.vulnerabilityassessment.co.uk/sqlat.htm
Subdomainer – http://www.edge-security.com/subdomainer.php
Tini – http://www.ntsecurity.nu/toolbox/tini/
VNC_bypauth – http://www.securityfocus.com/bid/17978/exploit
VNCrack – http://phenoelit-us.org/vncrack/docu.html
Wapiti – http://wapiti.sourceforge.net
WyD – http://www.remote-exploit.org/codes_wyd.html
Yersinia – http://www.yersinia.net/
XSpy – http://www.acm.vt.edu/~jmaxwell/programs/xspy/xspy.html
Zodiac – http://www.packetfactory.net/zodiac/
Layer 6
OpenSSL-Scanner – http://cve.mitre.org/cgi-bin/cvename.cgi?name=CAN-2002-0656
Squirtle – http://code.google.com/p/squirtle
Layer 5
Fierce – http://ha.ckers.org/fierce/
Marathon – http://www.codeplex.com/marathontool
Mbenum – http://www.cqure.net/wp/mbenum/
PSTools – http://technet.microsoft.com/en-us/sysinternals/default.aspx
SIPcrack – http://www.remote-exploit.org/codes_sipcrack.html
SIPDump – http://www.remote-exploit.org/codes_sipcrack.html
SIPp – http://freshmeat.net/redir/sipp/49242/url_homepage/sipp.sourceforge.net
SIPSak – http://sipsak.org/
Smap – http://www.wormulon.net/smap/
Layer 4
0trace – http://lcamtuf.coredump.cx
Amap – http://www.thc.org/thc-amap/
CIRT Fuzzer – https://www.buslab.org/index.php/content/view/45743/2/
CryptCat – http://cryptcat.sourceforge.net/
Driftnet – http://www.ex-parrot.com/~chris/driftnet/
Dsniff – http://monkey.org/~dugsong/dsniff/
Etherape – http://etherape.sourceforge.net/
Firewalk – http://www.packetfactory.net/projects/firewalk/
FPort – http://www.foundstone.com/us/resources/proddesc/fport.htm
Hping – http://www.hping.org/
Hping2 – http://sourceforge.net/projects/hping2/
Hping3 – http://gd.tuwien.ac.at/www.hping.org/hping3.html
ICMPTX – http://thomer.com/icmptx/
Iodine – http://code.kryo.se/iodine/
InTrace – http://www.swiecki.net/
Juggernaut – http://www.securiteam.com/tools/2NUQBSAQ0C.html
Matahari – http://matahari.sourceforge.net/
Netcat – http://netcat.sourceforge.net/
P0f – http://lcamtuf.coredump.cx/p0f.shtml
Privoxy – http://www.privoxy.org/
ProxyTunnel – http://proxytunnel.sourceforge.net/
Taof – http://sourceforge.net/projects/taof
TcPick – http://tcpick.sourceforge.net/
TCPtraceroute – http://michael.toren.net/code/tcptraceroute/
TCtrace – http://phenoelit-us.org/irpas/docu.html#tctrace
Scanrand – http://www.secureworks.com/research/articles/scanrand
SinFP – http://sourceforge.net/projects/sinfp/
Spike – http://www.immunitysec.com/resources-freesoftware.shtml
SuperScan – http://www.foundstone.com/us/resources/proddesc/superscan.htm
UnicornScan – http://www.unicornscan.org/
XProbe2 – http://xprobe.sourceforge.net/
Layer 3
Afrag – http://homepages.tu-darmstadt.de/~p_larbig/wlan/
Aircrack-ng – http://www.aircrack-ng.org/
Airdecap-ng – http://www.aircrack-ng.org/doku.php?id=airdecap-ng
Aireplay-ng – http://www.aircrack-ng.org/doku.php?id=aireplay-ng
Airmon-ng – http://www.aircrack-ng.org/doku.php?id=airmon-ng
Airpwn – http://airpwn.sourceforge.net/
AirSnarf – http://airsnarf.shmoo.com/
Angry IP Scanner – http://www.angryziber.com/w/Home
ASLeap – http://asleap.sourceforge.net/
Autonomous System Scanner – http://phenoelit-us.org/irpas/docu.html#ass
Autoscan – http://autoscan.free.fr/
Cheops-ng – http://cheops-ng.sourceforge.net/
Ethereal – http://www.ethereal.com/
Ettercap – http://ettercap.sourceforge.net/
Fping – http://www.fping.com/
itrace – http://phenoelit-us.org/irpas/docu.html#itrace
IKE-Scan – http://www.nta-monitor.com/tools/ike-scan/
IKEProbe – http://www.securityfocus.com/infocus/1821
IRDP Responder – http://phenoelit-us.org/irpas/docu.html#irdpresponder
Mailsnarf – http://www.irongeek.com/i.php?page=backtrack-3-man/mailsnarf
Netdiscover – http://nixgeneration.com/~jaime/netdiscover/
Netenum – http://phenoelit-us.org/irpas/docu.html#netenum
Netmask – http://phenoelit-us.org/irpas/docu.html#netmask
Nmap – http://www.insecure.org/nmap
PHoss – http://phenoelit-us.org/phoss/docu.html
Ping – http://www.hmug.org/man/8/ping.php
Protos – http://phenoelit-us.org/irpas/docu.html#protos
PSK-Crack – http://www.nta-monitor.com/tools/ike-scan/
Psyche – http://psyche.pontetec.com/
ScanLine – http://www.foundstone.com/us/resources/proddesc/scanline.htm
Wireshark – http://wireshark.org/
Layer 2
BlueBugger – http://www.remote-exploit.org/codes_bluebugger.html
Blueprint – http://trifinite.org/trifinite_stuff_blueprinting.html
Bluesmash – http://sourceforge.net/projects/bluesmash/
Bluesnarfer – http://www.alighieri.org/project.html
Btcrack – http://www.nruns.com/_en/security_tools_btcrack.php
Btscanner – http://www.pentest.co.uk/
Cain & Abel – http://www.oxid.it/cain.html
Carwhisperer – http://trifinite.org/trifinite_stuff_carwhisperer.html
File2Cable – http://phenoelit-us.org/irpas/docu.html#file2cable
Gobbler – http://gobbler.sourceforge.net/
HCIDUMP – http://www.linuxcommand.org/man_pages/hcidump8.html
MacChanger – http://alobbs.com/macchanger/
MDK3 – http://homepages.tu-darmstadt.de/~p_larbig/wlan/
Minicom – http://alioth.debian.org/projects/minicom/
ObexFTP – http://triq.net/obexftp.html
PcapSipDump – http://sourceforge.net/projects/psipdump
PcapToSip_RTP – http://wiki.cdyne.com/wiki/index.php?title=PCAP_To_SIP_and_RTP
Redfang – http://www.net-security.org/software.php?id=519
Ussp-Push – http://www.xmailserver.org/ussp-push.html
Wellenreiter – http://www.wellenreiter.net/
Wicrawl – http://midnightresearch.com/projects/wicrawl
Wifi Tap – http://sid.rstack.org/index.php/Wifitap_EN
WifiZoo – http://community.corest.com/~hochoa/wifizoo/wifizoo_v1.2.tgz
The group starts out with a short abstract. The abstract quickly explains that they reviewed the pros and cons of penetration testing, the tools used, and the creation of a lab environment for penetration testing. The group then explains that this lab will approach penetration testing using the OSI model with two extra layers attached (people and kinetics). They quickly mention the table that will be created to categorize the tools found into the OSI model and the McCumber Cube. I think this abstract could have done a better job of explaining what the lab is about.

Next the group did their literature review. The review starts off explaining that the focus of the articles in this lab was the creation of various laboratory environments to support penetration testing and an attack-based security strategy. The group then explains each paper, taking all the readings and comparing them to each other under different categories of discussion. They do a nice job of comparing and contrasting the readings this way. The problem I ran into is that they didn't compare the readings to this lab except for a couple of lines at the bottom of the review, and even then they just explain briefly what they are going to do in this lab. I also didn't see anywhere that the group explained the question of each paper or its methodology.

Next the group talked about the method of the lab. They mentioned only in passing that they were creating a lab environment; they did not go into any detail on how they were going to set up the lab, what operating systems they were using, or what they had to do to configure their machines in their environment. The group did explain that they were creating a table that categorizes a set of tools into the OSI model and the McCumber Cube. Another step that was missed was answering the two questions. The table that they produced was set up nicely; they categorized the tools in a way that made it clear that there is a bias.

After the table the group goes into the findings of the lab. In the findings the group does answer the question of why the tools fall under technology in the McCumber Cube: they state that the tools fall under technology because they exist in a computer system. I would say that not all tools exist on a computer, but they do still fit under technology. They also answered the question of bias in penetration testing, stating that because the tools fit under the technology category there tends to be a bias toward those tools rather than attacks using tools outside the technology realm. They also warn anyone who may use this list that it is only an aid and not a breakdown of threats and vulnerabilities. Again, the group does not mention anywhere the setup of their lab environment. Next the group talks about the issues they had. They mentioned the problem of being able to fit most tools into more than one category, which I think all groups had, as well as the problem of not knowing how many tools to use in this lab. Last, the group wrote their conclusion, stating that by creating this table of tools they were able to see patterns in the way tools affect different categories of the McCumber Cube.
The abstract was well written. It clearly stated what was going to be done in the lab as well as what the group would accomplish in the process of the laboratory experiment. This group put the required readings into different categories, which made it easy to compare papers within the same category as well as against the other papers. The group hit on every point that a literature review should, clearly citing the references they used in their literature review. The group is missing one section: the steps of the process. I think that this section is needed even though every group had the same steps to set up the Citrix environment, provided the group chose that option and did not use their own equipment. I like how the group put the table together by grouping all the tools that share the same OSI layer and McCumber Cube dimension into one row. This made reading the table much easier than most of the other groups' tables. The group did not find any tools that fit into layer 4 (Transport) in their table; I would like to have seen some tools in there. This group had some of the best tools in layer 0. Most of the other groups tried to find tools that dealt with technology to fit into this layer, while this group got imaginative with their list.
The group did not talk about the differences between Ethereal and Wireshark. They did, however, talk about the other questions presented in the description of the laboratory experiment. I agree with their reasoning as to why so many of the tools that the groups discovered fit into the technology dimension of the McCumber Cube. More than likely, when using technology as the attack method, technology is usually going to be the item affected, while policies and the human factors will not be affected as much. I actually find it quite calming to see that this group, like many of the others, had issues determining the OSI layer and the McCumber Cube dimension. The group's only other issue was the confusion, shared by the other groups, over the number of tools that needed to be found for the table. This group had one of the best conclusions. In it, the group stated that this was a mental exercise, and I agree with that statement for a variety of reasons. I also thought that this group had some confusion with this lab and with determining how to put it together. Other groups had the same problems, which I think was all part of the exercise of this laboratory experiment. This group realized that it was important to get this research done right and out of the way now for the ease of future labs.
The group gives a fairly decent overview of the lab, but it lacks depth. Do you agree or disagree with the various authors in the literature review? Why? In the introduction the group states that the lab is all about creating the test environment, and the literature is related back to this point, but building an environment is never actually mentioned. The conclusion states that the lab was about discovering the tools. Which is it?
I also have some questions and concerns about the results section. In layer 8 you list “Saboteur”. Isn't this really an operator who specializes in physical (layer 1) attacks rather than a specific attack aimed at the “people” layer? Several of your layer 0 tools do indeed have a kinetic effect, but I think this misses the mark: don't they really attack the physical layer? In discussing your findings the group states that the majority of the tools available do not fall in the technical realm, but your table appears to show otherwise. How do you reconcile this difference? The group states, “Any interaction which requires cognition and action on the part of the user would move into the people or processing area of the McCumber cube.” Processing is on a different plane from people and technology. Did you mean policy? In any case, I disagree with this statement. In order for technology to be used, there must be cognition and interaction on the part of the user. There is no such thing as a closed system.
On a final note, I would be careful about fact checking: Enterasys does a lot more than penetration testing, and it is not their main focus.
Team 5
I think team 5's abstract was well written and defined what they were going to do in their lab exercise. I liked how they organized their literature review; it made it very easy to read and to understand how the articles tied into the lab exercise. The group is missing the setup steps of the process. Nick did a good job of documenting the process; it would have been nice to hear how team 5 interpreted how the process was set up. I thought the way team 5 put the table together, grouping all the tools that share the same OSI layer and McCumber Cube coordinate, was personally very helpful to me. I didn't see any tools in layer 4; I am not sure if they missed that or just couldn't identify any. I liked the tools they listed for layer 0, much more creative than ours and the other groups'. I agree entirely with their findings, issues, and conclusions. Overall the paper and lab were well done.
The fifth team also presented a complete and well-thought-out lab exercise. The lab met most of the requirements set by the syllabus, and there were no real apparent issues or problems that stuck out at first examination. However, there was one item that could be improved upon: the abstract did not meet the requirements of the syllabus in terms of length. Team five presented a very cohesive and well-organized literature review, meeting the requirements set forth in the syllabus. In-text APA 5th-edition citations were included, and all articles were reviewed and given careful consideration as to where they fit, both in the literature review itself and in the lab. This team presented their review of the literature as divided into two (or possibly three) sub-categories: one dealing with the strengths and weaknesses of penetration testing, one dealing with the creation of a proper penetration testing lab, and possibly one more dealing with the tools used in penetration testing by previous authors. This style of literature review is different from that of any of the other labs reviewed here, and it provides a very good understanding of the state of the body of existing literature on the required topics.

The methods were detailed as to how the taxonomy was created, as well as how the questions presented in the lab were answered. One lacking item was the technical setup of the lab in VMware itself; there was no mention of setting IP addresses on virtual machines, as there was in all the other labs. The items in the taxonomy agreed with the other labs for layers one through seven, and the items in layers zero and eight were well considered and agreed with the labs from teams three and one. The questions asked in the lab were answered, and those answers also agreed with the other labs. This lab also agrees with teams three and one on many of the topics, ideas, and shortcomings of the literature presented for this lab. The technical merit of team five is easy to judge for the taxonomy, which is well thought out and accurate; where it cannot be judged is in the VMware lab setup, as that section is missing. The lab approach taken by team five is different from the other approaches, as it is more cohesive than team one's and team four's, and categorized differently than team three's. The only real enhancement that could be made in the future is in the abstract length, and in making sure all sections are accounted for. The only additional materials that may be needed concern group communication, just as with all the other groups. Team five's methods need no changes. The area of most interest to this reviewer is the conclusion and issues sections: not knowing where to place Wireshark is a good issue that I am sure all teams had but that was not recorded by any other team, as is the observation that lab one was a very mentally taxing activity.
@Borton. The statement about Enterasys doing business in the field of penetration testing meant only that it was a part of their business. This would be like saying that McDonald's does business in the field of selling french fries, but I can still always go pick up a Big Mac too.
@jverburg. Well said.
I feel that this is a very soundly written lab write-up. The literature review is nicely done. Perhaps I am inferring signals which really are not present in the literature review, but it seemed amusing that the first "vendor whitepaper" received discernibly more directed criticism than the other papers; prior negative experience with said vendor, possibly? I found the literature review closely connected to the lab procedure; well done. I also thought the remainder of the write-up was well conceived. A fairly comprehensive description of procedure was presented, a discussion of results and issues was included, and a nice summary was put forth in the conclusion section. Finally, the organization of the tool table was very nice, well researched, and presented with a thorough list of web links.
Now, for thoroughness, the (few) deficiencies found must be presented. It was noticeable that the procedures for tool discovery and evaluation were detailed, but no mention was made of the steps taken to construct the test environment. In fact, only a generic reference to "creating the lab environment" is found in the 'Lab Exercise' section, with no mention made thereafter. What considerations were made when designing the environment? Is it a VMware-based system, or were the resources located to create a small 'real iron' setup on some isolated LAN subnet? What operating system variants were chosen for the test setup? All of these are pertinent questions which should probably have been addressed.
So much for the procedure critique, as I really wish to address this team's 'lab question' answers. With relation to the first 'technology' question, I take issue with two ideas presented in this write-up's answer. First, I would argue, contrary to the assertion made, that 'cognition' is not strictly limited to the domain of 'human' and 'policy/practice' exploits. Rather than rehash the entire argument again, I invite those interested to examine my review of group two, in which I detail my logic. In brief summary: it appears that 'technology' can only be effectively employed as a tool in conjunction with the cognition required in its creation and direction; therefore it does not differ substantially in this regard from the other categories. It is then obvious that I will take exception to the second assertion, that an associated property of 'cognition' automatically classifies an exploit into the 'policy/process' or 'people' sectors of the McCumber cube. I would simply say that I cannot agree with this, based on the prior point.
As to the second question, I think the assertion that the majority of truly effective exploits lie outside the realm of technology is an excellent point. I did not discern a direct reason as to why most tools used are 'self-selected' into the technology area, however. I believe the implication in the answer was that, in some way, direct human interaction was undesirable within the scope of penetration testing, but no reason was given as to why this might be true. I might suggest that ethical considerations and legal liability are valid reasons not to use 'human' and 'policy/practice' tools. Again, a very nice write-up; I appreciated the scope of the discussion sections.
Overall team 5 has put together a very good document. The document review is very thorough and touched on some points that I missed when preparing mine. I particularly like the fact that the authors provided links to the various tools that were reviewed. That is something that our group overlooked.
I believe it is a good point that the author mentions that the Gula (1999) paper is a vendor whitepaper and as such may be biased to support the vendor's work. It is also good that the author points out that there is no supporting documented research concerning the areas that Gula states are often missed by penetration testers, and that these are merely the opinion of the author.
I like that the author quotes the paper's statement that "the majority opinion is that the only way to properly defend a system against attack is to understand what attacks look like as deeply and realistically as possible". This is the supporting argument for penetration testing and the very nature of this course.
There are a few changes that I would make to this document, however. For one thing, it states in the introduction, "we reviewed the pros and cons of penetration testing". Perhaps it would have been better worded as "benefits and limitations" rather than "pros and cons". This document also states, "As with security and usability it would not be possible to have one without the other...". Since increasing usability has an inverse effect on security and vice versa, perhaps it would be better stated that there is a trade-off between the two rather than stating that you cannot have one without the other.
Some other things I might change have to do with the classifications within the OSI model. Layer 7 of the OSI model deals with the applications that use the network, such as FTP applications, Telnet applications, web browsers, etc. For example, this paper places Dnstracer in the application layer. Perhaps it would be better placed in the network layer, since it is concerned with tracing domain name servers, which are in part responsible for directing network traffic. Another example is PBNJ. PBNJ is a network scanning tool, and therefore likely does not belong in the application layer; I believe it would go into layer 3, the network layer.
Mbenum is a tool that obtains information from the master browser about which services hosts run, such as Terminal Services or SQL Server. As such, I believe it belongs in the application layer of the OSI model rather than the session layer, since its vector of attack is the application directly.
I agree that many of the tools can belong to multiple categories. For example, a tool that scans for open ports on a network and uses that to cause a buffer overrun within an application operates at both the network and application layers. However, its initial point of attack is the network.
Altogether I believe group 5 has put together a well-written and complete document, particularly considering our time constraints. There were only some minor changes that I would make, and those changes are mostly a matter of my opinion.
In the abstract, I agree with what your team meant by "In this lab we review the pros and cons of penetration testing." Most of the articles addressed only the positive aspects of penetration testing or red teaming, but there were a few that did indeed cover some of the negative aspects of doing this type of testing.
I noticed that Team 5 also did not include a methods section. One of the lab questions had us address the biases of using attack tools, which some groups interpreted as adding our biases about where the tools should be placed in the table, while others applied the question to the ability to realistically use the tools in a virtual environment. I also noticed that your team did not mention anything at all about how you set up or modified your virtual environment.
In the literature review section I partially agree with team 5 when they stated, "The primary focus of the literature in Lab 1 is the creation of various laboratory environments to support penetration testing and an attack-based security strategy." However, the majority of the articles make the justification for the need to use offensive security techniques; hence the justification for red teaming and for how it is a good alternative for teaching students about security besides the traditional defensive techniques that are the norm in network or computer security programs. Team 5 needed to address the research questions in the articles (provided they had such statements), the methodology used, and how each article related to the lab assignment. The team did do a good job summarizing the supporting data and did point out a few errors or omissions in some of the articles, such as the conflict of interest when Gula pointed out what is normally missing in most penetration tests, which implied his company's penetration tests were free of such mistakes.
There were a few discrepancies discovered in the section that had the exploits table. The table was missing the technology column, which would explain what particular technology the exploit or attack tool would affect. The Transport layer of your table appeared to be completely missing. Some of the sections, such as the Presentation layer, did not contain the required number of 30 exploits. However, I liked the efficiency of your group in placing all of the related tools into one single box. While some attack tools could be used to attack different layers of the OSI model depending upon their functionality, some of the tools appeared to be in the wrong layers. Dumpster diving would go under the People layer of the OSI model, not the Kinetic layer. In layer 8, your group mentioned a saboteur as a layer 8 vulnerability. I agree that it would fit, but when it comes to the McCumber Cube portion of the table, a saboteur would affect availability more than integrity. Was the Kinetic layer for exploits that use computers to physically affect another computer, or a system or machine connected to a network, such as devices connected to SCADA controllers? Technically, would an anchor knocking out the Internet be a physical layer attack, not a kinetic attack?
In your works cited section, I thought it was good of your group to include the links to the different attack tools.
The abstract was clear and concise about what was going to happen in the paper. They mention reviewing the pros and cons of penetration testing, but I did not see where they took a stand on whether they were for or against it. Their literature review was good, but more details could have been given. The lab exercise was more than the matrix chart on the nine-layer OSI model. Their chart was very well put together, especially in combining the multiple tools that all fall under the same OSI layer and McCumber Cube coordinate; this made the chart easier to read and understand. The Links section was a great way for others to see where the group got the information used to place each tool in the matrix chart. In the issues, the group talks about the tools falling into more than one category. Most tools are written to be multi-functional, and the directions do say to put them in the place where the tool would fit best. The group had a good idea in placing the tools in the area where each has the potential to do the most damage.
I think that group 5's write-up for lab 1 is very good overall. The abstract for this lab was very well written. The literature review was good, although the group should have discussed whether or not they agreed with each reading. All of the citing for the literature review was done correctly; however, the page numbers for the references were only present on a few references. The setup portion of the lab describing the networking of the machines was non-existent: the group did not indicate how they configured networking on their virtual machines. The table containing the penetration testing tools was very good. The group discussed which tools covered multiple layers, and also their reasoning for the placements. The only thing that I disagree with is the tools that have been listed for layer 8. While it is not necessarily wrong, I think there should be some explanation of why each site listed is a TOOL rather than a RESOURCE.