April 17, 2025

12 thoughts on “Tech 581W Computer Network Operations, Laboratory 1: Team 5”

  1. The group starts out with a short abstract. The abstract quickly explains that they reviewed the pros and cons of penetration testing, the tools used, and the creation of a lab environment for penetration testing. The group then explains that this lab will approach penetration testing using the OSI model with two extra layers attached (people and kinetics). They briefly mention the table that will be created to categorize the tools found into the OSI model and the McCumber cube. I think this abstract could have given a better explanation of what the lab is about. Next the group did their literature review. The literature review starts off explaining that the focus of the articles in this lab was to produce various laboratory environments to support penetration testing and an attack-based security strategy. Then the group explains each paper, taking all the readings and comparing them to each other under different categories of discussion. They do a nice job of comparing and contrasting the readings this way. The problem I ran into is that they didn’t compare the readings to this lab except for a couple of lines at the bottom of the review, and even then they only briefly explain what they are going to do in this lab. I also didn’t see anywhere that the group explained the question of each paper or its methodology. Next the group talked about the method of the lab. They mentioned only in passing that they were creating a lab environment; they did not go into any detail on how they were going to set up the lab, what operating systems they were using, or what they had to do to configure the machines in their environment. The group did explain that they were creating a table categorizing a set of tools into the OSI model and the McCumber cube. Another step that was missed was any mention of answering the two questions. The table they produced was set up nicely.
They categorized the tools in a way that made it clear that there is a bias. After the table the group goes into the findings of the lab. In the findings the group does answer the question of why the tools fall under technology in the McCumber cube: they state that the tools fall under technology because they exist in a computer system. I would say that not all tools exist on a computer, but they do still fit under technology. They also answered the question of bias in penetration testing, stating that because the tools fit under the technology category there tends to be a bias toward those tools rather than toward attacks using tools outside the technology realm. They also warn anyone who may use this list that it is only an aid, not a breakdown of the threats and vulnerabilities. Again, the group does not mention anywhere the setup of their lab environment. Next the group talks about the issues they had. They mention the problem of being able to fit most tools into more than one category, which I think all groups had, as well as the problem of not knowing how many tools to use in this lab. Last, the group wrote their conclusion, stating that by creating this table of tools they were able to see patterns in the way tools affect different categories of the McCumber cube.

  2. The abstract was well written. It clearly stated what was going to be done in the lab as well as what the group would accomplish in the process of the laboratory experiment. This group put the required readings into different categories, which made it easy to compare the papers to each other within the same category as well as to the other papers. The group hit every point that a literature review should, and clearly cited the paper numbers for the references used in their literature review. The group is missing one section: the steps of the process. I think this section is needed even though every group had the same steps to set up the Citrix environment, provided the group chose that option and did not use their own equipment. I like how the group put the table together by grouping all the tools that share the same OSI layer and McCumber cube dimension into one row. This made the table much easier to read than most of the other groups’ tables. The group did not find any tools that fit into layer 4 (Transport); I would like to have seen some tools there. This group had some of the best tools in layer 0. Most of the other groups tried to find tools that dealt with technology to fit into this layer, while this group got imaginative with their list.
    The group did not talk about the differences between Ethereal and Wireshark. They did, however, talk about the other questions presented in the description of the laboratory experiment. I agree with their reasoning as to why so many of the tools the groups discovered fit into the technology dimension of the McCumber cube. More than likely, when using technology as the attack method, technology is usually going to be the item affected, while policies and human factors will not be affected as much. I actually find it quite comforting that this group, like many of the others, had issues determining the OSI layer and the McCumber cube dimension. The group’s only other issue was the same confusion the other groups had over the number of tools that needed to be found for the table. This group had one of the best conclusions. In it, the group stated that this was a mental exercise, and I agree with this statement for a variety of reasons. I also thought this group had some confusion with this lab and how to put it together. Other groups had the same problems, which I think was all part of the exercise of this laboratory experiment. This group realized that it was important to get this research done right, and out of the way now, for the ease of future labs.

  3. The group gives a fairly decent overview of the lab, but it lacks depth. Do you agree or disagree with the various authors in the literature review? Why? In the introduction the group states that the lab is all about creating the test environment, and the literature is related back to this point, but building an environment is never actually mentioned. The conclusion states that the lab was about discovering the tools. Which is it?
    I also have some questions and concerns about the results section. In layer 8 you list “saboteur”. Isn’t this really an operator who specializes in physical (layer one) attacks rather than a specific attack aimed at the “people” layer? Several of your layer 0 tools do indeed have a kinetic effect, but I think this misses the mark. Don’t they really attack the physical layer? In discussing your findings the group states that the majority of the tools available do not fall in the technical realm, but your table appears to show otherwise. How do you reconcile this difference? The group states, “Any interaction which requires cognition and action on the part of the user would move into the people or processing area of the McCumber cube.” Processing is on a different plane from people and technology. Did you mean policy? In any case, I disagree with this statement. In order for technology to be used, there must be cognition and interaction on the part of the user. There is no such thing as a closed system.
    On a final note, I would be careful about fact checking. Enterasys does a lot more than penetration testing, and it’s not their main focus.

  4. Team 5
    I think team 5’s abstract was well written and defined what they were going to do in their lab exercise. I liked how they organized their lit review; it made it very easy to read and understand how the articles tied into the lab exercise. The group is missing the set-up steps of the process. Nick did a good job of documenting the process, and it would have been nice to hear how team 5 interpreted how the process was set up. I thought the way team 5 put the table together, by grouping all the tools that share the same OSI layer and McCumber cube dimension, was very helpful to me personally. I didn’t see any tools in layer 4; I’m not sure if they missed that or just couldn’t identify them. I liked the tools they listed for layer 0. Much more creative than ours and the other groups’. I agree entirely with their findings, issues, and conclusions. Overall the paper and lab were well done.

  5. The fifth team also presented a complete and well thought out lab exercise. The lab met most of the requirements of the syllabus, and there were no real apparent issues or problems that stuck out at first examination. However, one item could be improved upon: the abstract did not meet the length requirement of the syllabus. Team five presented a very cohesive and well organized literature review, meeting the requirements set forth in the syllabus. In-text APA 5 style citations were included, and all articles were reviewed and given careful consideration as to where they fit, both in the literature review itself and in the lab. This team divided their review of the literature into two (or possibly three) sub-categories: one dealing with the strengths and weaknesses of penetration testing, one dealing with the creation of a proper penetration testing lab, and possibly one more dealing with the tools used in penetration testing by previous authors. This style of literature review is different from any of the other labs reviewed here, and provides a very good understanding of the state of the existing literature on the required topics. The methods detailed how the taxonomy was created, as well as how the questions presented in the lab were answered. One lacking item was the technical setup of the lab in VMware itself; there was no mention of setting IP addresses on the virtual machines, as there was in all the other labs. The items in the taxonomy agreed with the other labs for layers one through seven, and the items in layers zero and eight were well considered and agreed with the labs from teams three and one. The questions asked in the lab were answered, and those answers also agreed with the other labs. This lab also agrees with teams three and one on many of the topics, ideas, and shortcomings of the literature presented for this lab.
The technical merit of team five is easy to judge for the taxonomy, which is well thought out and accurate. Where it cannot be judged is in the VMware lab setup, as that section is missing. The lab approach taken by team five is different from the other approaches: it is more cohesive than those of teams one and four, and categorized differently than team three’s. The only real enhancement that could be made in the future is the abstract length, and making sure all sections are accounted for. The only additional materials that may be needed concern group communication, just as with all the other groups. Team five’s methods need no changes. The area of most interest to this reviewer is the conclusion and issues sections. Not knowing where to place Wireshark is a good issue that I’m sure all teams had but that no other team recorded, as is the observation that lab one was a very mentally taxing activity.

  6. @Borton. The statement about Enterasys doing business in the field of penetration testing meant only that it was a part of their business. This would be like saying that McDonald’s does business in the field of selling french fries, but I can still always go pick up a Big Mac too.

  7. I feel that this is a very soundly written lab write-up. The literature review is nicely done. Perhaps I am inferring signals which really are not present in the literature review, but it seemed amusing that the first ‘vendor whitepaper’ received discernibly more directed criticism than the other papers; prior negative experience with said vendor, possibly? I found the literature review closely connected to the lab procedure, which was well done. I also thought the remainder of the write-up was well conceived. A fairly comprehensive description of procedure was presented, a discussion of results and issues was included, and a nice summary was put forth in the conclusion section. Finally, the organization of the tool table was very nice, well researched, and presented with a thorough list of web links.

    Now, for thoroughness, the (few) deficiencies found must be presented. It was noticeable that the procedures for tool discovery and evaluation were detailed, but no mention of the steps taken to construct the test environment was made. In fact, only a generic reference to “creating the lab environment” is found in the ‘Lab Exercise’ section, with no mention made thereafter. What considerations were made when designing the environment? Is it a VMware based system, or were the resources located to create a small ‘real iron’ setup on some isolated LAN subnet? What operating system variants were chosen for the test setup? All these are pertinent questions which should probably have been addressed.

    So much for the procedure critique, as I really wish to address this team’s ‘lab question’ answers. With relation to the first ‘technology’ question, I take issue with two ideas presented in this write-up’s answer. First, I would argue, contrary to the assertion made, that ‘cognition’ is not strictly limited to the domain of ‘human’ and ‘policy/practice’ exploits. Rather than rehash the entire argument again, I invite those interested to examine my review of group two, in which I detail my logic. In brief summary: it appears that ‘technology’ can only be effectively employed as a tool in conjunction with the cognition required in its creation and direction, and therefore does not differ substantially in this regard from the other categories. It is then obvious that I will take exception to the second assertion, that an associated property of ‘cognition’ automatically classifies an exploit into the ‘policy/process’ or ‘people’ sectors of the McCumber cube. I simply cannot agree with this, based on the prior point.

    As to the second question, I think the assertion that the majority of truly effective exploits lie outside of the realm of technology is an excellent point. I did not discern a direct reason as to why most tools used are ‘self-selected’ into the technology area, however. I believe the implication in the answer was that in some way, direct human interaction was undesirable within the scope of penetration testing-but no reason was given as to why this might be true. I might suggest that ethical considerations and legal liability are valid reasons not to use ‘human’ and ‘policy/practice’ tools. Again, very nice write-up; I appreciated the scope of the discussion sections.

  8. Overall, team 5 has put together a very good document. The document review is very thorough and touched on some points that I missed when preparing mine. I particularly like the fact that the author provided links to the various tools that were reviewed. That is something our group overlooked.

    I believe it is a good point that the author mentions that the Gula (1999) paper is a vendor whitepaper and as such may be biased to support the vendor’s work. It is also good that the author points out that there is no supporting documented research concerning the areas that Gula states are often missed by penetration testers, and that these are merely the author’s opinion.

    I like that the author quotes the paper that “the majority opinion is that the only way to properly defend a system against attack is to understand what attacks look like as deeply and realistically as possible”. This is the supporting argument for penetration testing and the very nature of this course.

    There are a few changes that I would make to this document, however. For one thing, the introduction states, “we reviewed the pros and cons of penetration testing”. Perhaps it would have been better worded as “benefits and limitations” rather than “pros and cons”. The document also states, “As with security and usability it would not be possible to have one without the other.” Since increasing usability has an inverse effect on security and vice versa, perhaps it would be better stated that there is a trade-off between the two rather than that you can’t have one without the other.

    Some other things I might change have to do with the classifications within the OSI model. Layer 7 of the OSI model covers the applications that use the network, such as FTP applications, Telnet applications, web browsers, etc. For example, this paper shows DNS Tracer in the application layer. Perhaps it would be better placed in the network layer, since it is concerned with tracing domain name servers, which are in part responsible for directing network traffic. Another example is PBNJ. PBNJ is a network scanning tool, and therefore likely doesn’t belong in the application layer; I believe it would go into layer three, the network layer.
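    To make the placement debate concrete, here is a minimal sketch of the kind of taxonomy being argued over. The (layer, dimension) placements below reflect this comment's suggested corrections, not team 5's actual table, and are purely illustrative:

```python
# Hypothetical tool-to-taxonomy mapping; the placements reflect this
# comment's suggested corrections and are illustrative, not authoritative.
from collections import defaultdict

OSI_NAMES = {3: "Network", 7: "Application"}

# tool -> (OSI layer, McCumber cube dimension)
placements = {
    "DNS Tracer": (3, "technology"),  # traces delegation across name servers
    "PBNJ":       (3, "technology"),  # network scanning / change detection
    "Mbenum":     (7, "technology"),  # queries the master browser service
}

# Group tools sharing a (layer, dimension) pair into one table row,
# mirroring how team 5 combined tools in their matrix.
rows = defaultdict(list)
for tool, key in placements.items():
    rows[key].append(tool)

for (layer, dim), tools in sorted(rows.items()):
    print(f"Layer {layer} ({OSI_NAMES[layer]}) / {dim}: {', '.join(tools)}")
```

    Grouping by the (layer, dimension) pair is what makes the shared-row table format possible, and it also makes the classification bias visible: every entry here lands in the technology dimension.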

    Mbenum is a tool that obtains information from the master browser about what services hosts run, such as Terminal Services or SQL Server. As such I believe it would belong in the application layer of the OSI model rather than the session layer, since its vector of attack is the application directly.

    I agree that many of the tools can belong to multiple categories. For example, a tool that scans for open ports on a network and uses that to cause a buffer overrun within an application operates at both the network and application layers. However, its initial point of attack is the network.
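    The distinction between where a multi-layer tool begins its attack and where its payload lands can be illustrated with a minimal TCP connect scan (a toy sketch, nothing like a real scanner such as nmap): the probe below only exercises the network and transport machinery, regardless of what application-layer exploit might follow.

```python
# Minimal TCP connect-scan sketch (illustrative only; real scanners are far
# more capable). The probe exercises the lower layers even when a follow-on
# exploit would target the application layer.
import socket

def scan_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` accepting TCP connections on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                open_ports.append(port)
    return open_ports

if __name__ == "__main__":
    # Open a throwaway listener so the scan has something to find.
    listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    listener.bind(("127.0.0.1", 0))         # let the OS pick a free port
    listener.listen(1)
    port = listener.getsockname()[1]
    print(scan_ports("127.0.0.1", [port]))  # reports the listener's port open
    listener.close()
```

    Nothing in this probe knows or cares what application answers on the port, which is why the initial point of attack sits below the application layer even when the eventual exploit does not.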

    Altogether I believe group 5 has put together a well written and complete document, particularly considering our time constraints. There were only some minor changes that I would make, and those changes are mostly a matter of my opinion.

  9. In the abstract, I agree with what your team meant by “In this lab we review the pros and cons of penetration testing.” Most of the articles addressed only the positive aspects of penetration testing or red teaming, but there were a few that did indeed cover some of the negative aspects of doing this type of testing.

    I noticed that Team 5 also did not include a method section. One of the lab questions had us address the biases of using attack tools, which some groups interpreted as adding our biases as to where the tools should be placed in the table, while others applied the question to the ability to realistically use the tools in a virtual environment. I also noticed that your team did not mention anything at all about how you set up or modified your virtual environment.
    In the literature review section I partially agree with team 5 when they stated, “The primary focus of the literature in Lab 1 is the creation of various laboratory environments to support penetration testing and an attack-based security strategy.” However, the majority of the articles make the justification for the need to use offensive security techniques; hence the justification for red teaming and how it is a good alternative for teaching students about security beyond the traditional defensive techniques that are the norm in network or computer security programs. Team 5 needed to address the research questions in the articles (provided they had such statements), the methodology used, and how each article related to the lab assignment. The team did do a good job summarizing the supporting data and did point out a few errors or omissions in some of the articles, such as the conflict of interest when Gula pointed out what is normally missing in most penetration tests, which implied his company’s penetration tests were free of such mistakes.

    There were a few discrepancies in the section containing the exploits table. The table was missing the technology column, which would explain what particular technology each exploit or attack tool would affect. The Transport layer of your table appeared to be completely missing, and some of the sections, such as the Presentation layer, did not contain the required number of 30 exploits. However, I liked the efficiency of your group in placing all of the related tools into one single box. While some attack tools could be used to attack different layers of the OSI model depending upon their functionality, some of the tools appeared to be in the wrong layers. Dumpster diving would go under the People layer of the OSI model, not the Kinetic layer. In Layer 8, your group mentioned a saboteur as a layer 8 vulnerability. I agree that it would fit, but when it comes to the McCumber cube portion of the table, a saboteur would affect availability more than integrity. Was the Kinetic layer for exploits that used computers to physically affect another computer, or a system or machine connected to a network, such as devices connected to SCADA controllers? Technically, would an anchor knocking out the Internet be a physical layer attack, not a kinetic attack?

    In your works cited section, I thought it was good of your group to include the links to the different attack tools.

  10. The abstract was clear and concise about what was going to happen in the paper. They mention reviewing the pros and cons of penetration testing, but I did not see where they took a stand on whether they were for or against it. Their literature review was good, but more details could have been given. The lab exercise was more than the matrix chart on the 9-layer OSI model. Their chart was very well put together, especially in combining the multiple tools that all fall under the same OSI layer and McCumber cube coordinate. This made the chart easier to read and understand. The links were a great way for others to view where the group got the information used to place each tool in the matrix chart. In the issues section, the group talks about the tools falling into more than one category. Most tools are written to be multi-functional, and the directions do say to put them in the place where the tool fits best. The group had a good idea in placing the tools in the area where they have the potential to do the most damage.

  11. I think that group 5’s write-up for lab 1 is very good overall. The abstract for this lab was very well written. The literature review was good, although the group should have discussed whether or not they agreed with each reading. All of the citations for the literature review were done correctly; however, the page numbers were only present on a few references. The setup portion of the lab describing the networking of the machines was non-existent: the group did not indicate how they configured networking on their virtual machines. The table containing the penetration testing tools was very good. The group discussed which tools covered multiple layers, and their reasoning for those tools covering multiple layers. The only thing I disagree with is the tools listed for layer 8. While they’re not necessarily wrong, I think there should be some explanation of why each site listed is a TOOL rather than a RESOURCE.
