April 19, 2025

10 thoughts on “Tech 581W Computer Network Operations, Laboratory 1: Team 4”

  1. The group’s abstract was much shorter than the requirements stated. It was basically a short restatement of the first few sentences of the lab statement. The abstract should be a summary of what the lab report covers and what is to be accomplished during the laboratory experiment. The next part of the lab report was the literature review. The reviews of the papers were quite short and did not have all of the required items for a literature review. The group did include citations along with page numbers in the reviews of the papers. The literature reviews did state how each paper relates to the lab, but the papers were not compared to each other. The next part of the lab report was the steps of the process. This section was lacking for several reasons. First, the steps for setting up the Citrix environment were not neatly written out. Second, there were no screenshots of the process. Last, the research on the tools was not discussed in this section. Other groups went into more detail in their steps of the process and methodology sections.
    The next part of the lab report was the table of the tools and how they fit into the OSI model as well as the McCumber cube. The group had unique tools for layer 8, but the tools were not drawn from different categories; they were all part of the same group. I have to disagree with the placement of some of the layer 8 tools; not all of them fit into the storage part of the McCumber cube. Some of the tools did not have McCumber coordinates entered in the table, and I felt that the table was not complete at the time of submission. For future charts, it would look neater if the tools were grouped together instead of having a different row for each tool. One part of their table I did think was nice was that they listed what each tool exploits; not many other groups did this in their tables. Layer 0 did not have enough tools, and I think more kinetic tools that do not deal with technology are needed in the table. The group listed only one issue with the lab experiment: that there were not enough kinetic tools. Finding these was part of the research the group was supposed to do for this lab experiment, and the issues could have been elaborated on beyond just what was stated. The conclusion was weak; it looks like it was simply copied and pasted from the abstract. With all of the research and work done for the lab experiment, a conclusion should have been reached. This lab report was also missing an important section: the findings and answers to the questions. Not only were the questions we were given left unanswered, they were never even mentioned in the group’s lab report.

  2. The formatting of the group’s submission makes it very difficult to follow. There are numerous spelling and grammar errors that make it extremely difficult to absorb the message. Inconsistent formatting and odd changes in voice add to the confusion. I recommend seeking an outside party to edit before submitting next time.
    There needs to be more depth to the content. It’s good that the group attempted to relate the literature review to the lab, but what are your thoughts about the papers? You lay out the tools in a very readable table, but you never really explain what the table shows. Are there more tools in one area than another? Are certain aspects of McCumber’s cube more heavily attacked? Why? The conclusion states “We did this,” but it doesn’t really say what benefit there was to it.

  3. When reviewing this lab, the first thing I noticed was that it could have been cleaned up and formatted better. After reading the lab multiple times, it almost felt like something was missing. The abstract and conclusion sounded like reiterations of each other, and the abstract came off as monotone and disjointed. One thing that made it feel this way is the formatting of the post. I do not believe this was an intentional act, but in the future go back and make sure the post is not only informative but also put together properly for the blog’s formatting. The conclusion could have included the students’ experience and what they gained from the lab. Next, the article reviews were nice in that they related the articles to the lab. What the team can do to improve them in the future is not to separate out how each article relates but to combine that discussion and make the reviews sound more cohesive. One thing that did stick out, though, was the statement that there would be no politics, emotions, or fratricide involved with the lab environment. When people work together on any project, there will always be human emotion toward the work that is done. Sometimes things happen within groups even though the group members may not have wanted it that way, and no matter how hard one may try, politics cannot be avoided; it is just part of human interaction and of working together. I can see how one would hope these would not be a factor, but they are something that is dealt with in every team, large and small.
    Next they go on to describe the components of the lab environment and show the operating systems and the addressing they used for each system. They did say they used preloaded operating systems; in the future this needs to be clarified: are they virtual machines, or something different? One thing that could be improved is including a diagram of the system, which would help the presentation and how people view what has been done. The table at first glance does what it is supposed to do, but it seems cluttered and overwhelming to anyone reading it. In addition, the table formatting looks off: the font differs in places, and the cells are out of alignment. Another thing with the table is not to forget to reference any tools that were put into the lab. Overall, the team did a good job of explaining what was to be done, and they had a good understanding of the subject matter. In the future, better organization will help the group keep the reader interested and not distracted by little errors. No group is going to be perfect, but this is part of our learning, and we will be able to improve from the comments, which will help each of us refine the labs and deliver a better end product.

  4. The fourth team, like teams one and three, presented a complete lab; however, there were some issues that stood out right away. While the lab did meet most of the requirements of the syllabus, the first immediate issue was the lab formatting. The syllabus clearly pointed out that copying and pasting straight from Microsoft Word should be avoided, as MSO tags are included “behind the scenes” and affect the formatting of the lab once submitted through WordPress. It is immediately apparent that this advice was not heeded by team four. Team four’s lab is badly formatted in the opening paragraphs, but it does seem to improve shortly after that. The abstract, however, is lacking per the direction of the syllabus. The lab format seems to be that of an undergraduate lab rather than a graduate student lab report; it lists steps of the process twice while neglecting to place methods anywhere in the lab report. The issues section does not agree with any of the other lab reports, and that leads it to be suspect. The questions that are asked in the lab 1 guide are apparently missing, and the taxonomy, while complete for layers 1 through 7, does not really agree with the other lab reports for layers zero and eight, again making it slightly suspect. What could be considered a methods section is the second listing of steps of the process, and there the level of detail is lacking. While the technical information provided is both complete and consistent with the other labs, no thought is given to how the taxonomy was completed or how the questions will be answered.
    The literature review itself is rather complete and does aim to answer all of the literature review questions in the syllabus. This is done through an evaluation of each individual reading followed by a few sentences detailing how it answers the literature review questions and how it fits into the technical aspects of the lab. What is lacking here is cohesion: with each reading analyzed separately, the review does not create a well-thought-out evaluation and analysis of the state of the body of literature. Team four does seem to agree with the other labs in layers 1 through 7 of the taxonomy, as well as in the steps taken to set up the VMware-based lab environment, such as the assigned IP addresses, as per the syllabus and lab guide. The technical merit of team four’s position is hard to judge objectively, as the technical aspects of this lab are rather simple and, as with the other lab reports, very much the same. Team four’s approach to completing the lab is much like team one’s approach and not much like team three’s or five’s in terms of the literature review and methods. Where improvements can be made is in lab formatting as well as better cohesion of the literature review. This team could benefit from a better understanding of the syllabus and, like the other teams, since this is the first lab in this class, from better team communication. Including a methods section would also be helpful in the future.

  5. The formatting of the text in this post contains lots of odd line breaks, which make it very difficult to read. Another section contained what looked like improperly formatted code from the import, and the title of each article was randomly set in all capital letters, which also added to the difficulty of reading this report.
    The literature review lacked cohesion between the various topics of the lab exercises and instead broke the reviews up into individual summaries of each of the articles, with a paragraph afterward on how each related to the lab exercises. A standard Word spell check of just the literature review revealed 13 spelling errors. None of the reviews contained any sort of opinion by the authors of the report on whether or not they agreed with the stances the articles took. In one of the reviews, the stance is taken that many of the tool authors use “overblown rhetoric about the lethality of their tools.” Some examples of this would have been interesting to see, along with the report authors’ opinion on why a particular tool wasn’t as lethal as its author was claiming.
    The OSI/McCumber table has some formatting issues and contains an extraneous column for “Technology,” which often (though not always) contained data that was fairly obvious; the text was also not formatted consistently (some centered, some left aligned). One major omission in this section was links to any of the tools, which would have helped anyone desiring further information about a tool. Some of the entries in the table, “spanning tree attack” for instance, aren’t actually tools; they’re just nondescript attacks that one might perpetrate using tools. For the purposes of this exercise we were supposed to identify specific tools. Another issue with the table is in layer one: when describing attacks against WiFi, the data in the technology column switches from WiFi to 802.11. Aren’t these the same thing? Further along in this section a Prism2 card is mentioned as an attack tool. How so? Isn’t this just a wireless card chipset? In layer zero, Phrack is mentioned as a tool against traffic lights. Is “Phrack” referencing a tool or the magazine? Directly following that is “overclock CPU to the point of failure.” Is there a tool to do this remotely (which would be really cool)? “Drastically reduc[ing] fan speed” could be the result of an attack using a tool such as a type of aerosol spray; reducing the speed would be the desired effect, not the tool itself.
    The issues section was very weak, and given some of the post’s formatting problems, I find it hard to believe that the only issue encountered was a lack of tools for layer zero attacks. The conclusions section was also incomplete. Simply restating the lab exercises and stating that they were completed doesn’t show what was learned or taken away from the lab exercises, primarily the insight gained from viewing the threat taxonomy in relation to the questions asked by the professor for the lab assignment.

  6. Team 4’s abstract was too short and didn’t provide enough of a summary of what the lab exercise was to entail. The reviews of their articles were a bit short but did seem to tie them back to the lab exercise. The steps of the set-up process were indicated, but again, Nick did such a good job of documenting the process that I think they should have used his screenshots and then provided an overview of what they thought of the setup process. There were apparent spelling and grammar errors as well as formatting errors. The research on the tools was not discussed; the other groups went into more detail. The table of the tools and how they tied into the OSI model and McCumber cube was hard to follow, perhaps because of the formatting. In the future they may be better served to group the tools by layer to better organize the chart. Their conclusion was inadequate; one should have been reached given that this was research. The issues section could have been expanded beyond what they discussed.

  7. My initial impressions of this lab write-up: the literature review was excellent. I feel that the literature review did an exceptional job of picking out the key points of the articles and then relating them to the lab exercise. It is my belief that a ‘literature review’ is intended to examine and evaluate content, not necessarily style or conformance to document standards: in this regard I judged this review to be right on target. I was also impressed with the way the tool chart was composed. The additional ‘Technology’ heading, which defined the entity/protocol being targeted, was definitely a nice touch. I might suggest a different heading than ‘Technology’, however, as this leads to some confusion with regard to the McCumber classification system. Perhaps something more general like ‘Target’ might be in order. Additionally, it appeared to me that the tool classification was well researched, and the McCumber classifications were done in a logical fashion.

    Now, however, I must address some of the problems with this write-up. The most obvious flaw is the omission of answers to the lab questions. This is extremely unfortunate, as these answers provide one of the primary means by which the lab can be discussed in a ‘positive’ way (i.e., five hundred words arrive rather quickly). Even worse, there is absolutely no discussion of the results of the lab. Equally bad (especially when accompanied by the previous deficiencies) is the simple hand-wave given to the problems section. So then, the question arises: what, if anything at all, is there to discuss about the write-up of this lab? It certainly puts peer reviewers in a difficult position, as most likely the only approach to gaining the required word count is to ramble on about nothing at all, or to nit-pick and blow rather minor flaws (flaws that would have gone unmentioned had further material been available to discuss) well out of proportion. I recall that one of the other groups mentioned that ‘fratricide’ in the pen-testing sense was unlikely to be an issue in this lab group setup, but it does make one wonder if other forms of ‘fratricide’ could be encountered because of certain requirements and built-in factors associated with this class. I would plead that in the future, for everyone’s sake, more attention be paid to ‘what’ should be included in the lab write-up.

    So let’s begin then, shall we? I found that the lab was poorly transferred and/or formatted with respect to posting on the blog. Obvious issues with word wrapping made the write-up very difficult to follow because of the uneven line length. Additionally, some spaces were missing, especially in the literature review, which made the run-together words difficult to decipher: this was at the very least distracting. I also found the double ‘Steps of the Process’ heading to be distracting. Furthermore, it appeared to me that when the ‘Steps of the Process’ were spelled out, very little was said about what was actually done. Finally, through the lens of paranoia, the wisdom of posting so much information about one’s penetration testing setup might be in doubt. In reality, with so much unknown about future lab exercises, is it not possible that the lab teams will be pitted against each other (red team versus blue team configurations)? This might be a substantial disadvantage to this team, as the machines would need to be reconfigured so that the published information could not be used against them (a bit of a stretch perhaps, but something to consider).

  8. In the abstract and in some of the literature review, the formatting seems to be off and distracting. The group failed to include any statement in the abstract linking it to the literature review. In the literature review they did relate each review back to the lab assignment. Under setting up the environment, it is stated that the team used the preloaded operating systems that were on Citrix and that the only preparation was to create static IP addresses for the machines. All the virtual machines seem to be consistent except for the Linux virtual machine: the groups were given Debian, yet the Linux virtual machine is stated to be Kubuntu. Also stated are an Active Directory domain name, a DNS domain name, and a NetBIOS name. Which box does the NetBIOS name belong to? Was another virtual machine created to be a domain controller? As previously stated, the only setup mentioned was adding static IP addresses to the virtual machines. Then the usernames and passwords were given for the Kubuntu and Windows Server 2003 machines.
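    As a rough illustration only (this is not the team’s actual configuration, and every hostname and address below is a placeholder), a static addressing plan like the one implied here can be written down and sanity-checked before it is typed into each guest:

        import ipaddress

        # Hypothetical plan: placeholder hostnames and addresses for the lab guests.
        plan = {
            "windows-server-2003": "192.168.1.10",
            "windows-xp": "192.168.1.11",
            "linux-kubuntu": "192.168.1.12",
        }

        subnet = ipaddress.ip_network("192.168.1.0/24")
        addresses = [ipaddress.ip_address(a) for a in plan.values()]

        # Every static address should sit inside the shared lab subnet and be unique.
        assert all(a in subnet for a in addresses), "address outside the lab subnet"
        assert len(set(addresses)) == len(addresses), "duplicate static address"

        for host, addr in sorted(plan.items()):
            print(f"{host}: {addr}/{subnet.prefixlen}")

    Something this small would also make it obvious at a glance whether a fourth box, such as a separate domain controller, was part of the plan or not.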

  9. Group 4 put together a very good document. There are some areas that I particularly liked. One is the Technology column that was placed in the OSI classification grid: for each attack tool they listed the attack vector, which justifies the tool’s placement in that layer of the OSI model.

    Another good point the authors mentioned was isolating the DETER testbed to prevent malicious code from escaping, and that testbed artifacts must be taken into consideration when analyzing the data obtained from the tests.

    I agree that “broadening the scope of penetration testing” addresses many areas that may be overlooked in “real world” penetration testing. Since we are conducting our penetration testing in a laboratory environment, the article only serves to explain additional exploits that we may use (or that may be used against us) during our experiments. It also gives us some foresight into the issues that we may encounter in real-world penetration testing.

    I do have an area of contention with a statement made in this document, however, concerning the article “Vulnerability Testing of Software.” Group 4 relates this to the lab as “…it is unwise to make assumptions about the behavior of the tools that will be used in the virtual machines…”. The fault injection paper addressed software design that may contain vulnerabilities when the environment is outside the scope expected by the program. I believe that article was in our readings as a possible exploit target and not as a limitation of the testing tools.

    It was a very good description of the DETER lab. I agree that, although we are not going to work with internet worms per se, our labs may exploit some of the same vulnerabilities that internet worms exploit. As such, I believe that having an isolated lab with disposable operating systems is a good idea.

    The article does address the issue that there are not many attack tools for kinetic vulnerabilities. Although there may not be any tools specifically listed for kinetic vulnerabilities, that does not mean kinetic systems are not vulnerable. For example, in theory, any of the above-listed tools that would give a hacker administrative access to the application controlling a kinetic device could also allow kinetic control of that device.

    All things considered, I believe this document is extremely well written and very detailed. There were only some minor points that I disagreed with.

  10. I think that group 4’s write-up for lab 1 was good in many areas but poor in others. The abstract for this lab was very short and didn’t sum up the lab. The literature review was adequate but did not answer all of the required questions; the group should have discussed whether or not they agreed with each reading. All of the citing for the literature review was done well, and the page numbers for the references were also included. The setup portion of the lab describing the networking of the machines was short, missing steps and information, and oddly formatted. The group did not indicate how they configured networking on their virtual machines. The table containing the penetration testing tools was very good in many areas because they applied many tools and techniques not added by other groups. However, many tools and techniques had no place in the table or were already covered by other entries (“getting them drunk or high”?). The issues and problems section could have had a lot more depth, and the conclusion for this lab was weak at best. The odd formatting and several grammatical errors definitely took away from the overall professionalism of the lab.
