April 19, 2025

9 thoughts on “TECH 581 W Computer Network Operations: Laboratory 2, Team 1”

  1. Team one did a very good job in their abstract explaining what was going to be done in the lab. The abstract met the requirements of the syllabus in terms of length and explanation. The introduction to the lab, as found in the peer review section, did at first appear to be somewhat redundant, but concluded by explaining how lab two would fit into penetration testing, seemingly for someone who had not first read their lab one exercise. The thought of an introduction is something team two has not considered, let alone one that allows the lab to stand on its own, so I thought that was a good touch. The literature review itself, while covering all of the topics of the syllabus and being more cohesive than team one’s first literature review, still did seem to read like more of a listing of the articles rather than a unified view of the literature in question. The methods section did leave much to the imagination and was not a good overview of the strategy and technique used to complete the lab. The two paragraphs simply listed the steps that were to be completed in a fifty-thousand-foot overview. While this does seem to be the general way methods are being written for labs in this class, that does not make them scholarly in nature. There needs to be more. The first grid or taxonomy on active recon exploits does seem well balanced, containing multiple exploits per OSI layer. I do however question the inclusion of Google in layer 6 and Kismet in layer 5. Does Google operate at the presentation layer? And is Google an ACTIVE recon tool? Also, Kismet is an 802.11 WiFi sniffer and cracking tool, which would seem to run at layer 2. Team one’s second table, comparing the OSI model to the TCP/IP model as well as five SCADA protocols, is nothing short of confusing and very difficult to understand. Breaking each SCADA protocol into its own table combined with the OSI model would make much more sense and be much simpler to understand.
In fact, I found it almost impossible to obtain any useful information from the table. Team one’s explanation of the TCP/IP model explains what Stallings’s views on the model are (less complex than OSI), but does not say anything about how the TCP/IP model is designed for TCP/IP while the OSI model is technically protocol agnostic at all layers instead of just the network access or physical layer. This was not required as part of the lab, but does seem to be a logical conclusion to draw. The explanation of the possibility of five layers in the TCP/IP model is complete and seems to agree with others in the class. Team one’s explanation of MODBUS does cover the many iterations of MODBUS, including MODBUS TCP, but doesn’t mention TCP port 502 anywhere. This seems like a large error in terms of the point of the lab. Team one’s explanation of DNP3 is complete, but they are missing a heading for DeviceNet, which left me wondering if they missed it entirely at first glance. Once found, that explanation seemed complete as well. I agree with team one’s technical conclusions on the TCP/IP model, since until DoD changes the model it does officially have only four layers. The best way to enhance the next lab would be the inclusion of better methods. They do not, however, need to change their methods. Team one presented a complete lab as per the syllabus.
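    To make the port 502 point concrete, here is a minimal sketch in Python (standard library only; the unit id and register values are hypothetical, chosen only for illustration) of what a MODBUS TCP request looks like on the wire:

```python
import struct

MODBUS_TCP_PORT = 502  # the well-known port the report never mentions

def build_modbus_tcp_request(transaction_id: int, unit_id: int,
                             start_register: int, count: int) -> bytes:
    """Build a MODBUS TCP 'read holding registers' (function 0x03) request.

    The MBAP header (transaction id, protocol id = 0, remaining byte count,
    unit id) prepended to the ordinary MODBUS PDU is what distinguishes
    MODBUS TCP from the serial MODBUS variants.
    """
    pdu = struct.pack(">BHH", 0x03, start_register, count)
    mbap = struct.pack(">HHHB", transaction_id, 0, len(pdu) + 1, unit_id)
    return mbap + pdu

frame = build_modbus_tcp_request(1, 1, 0, 2)  # 7-byte MBAP + 5-byte PDU
```

Anything answering on TCP 502 is a strong hint of a MODBUS device on the network, which is exactly why omitting the port from a reconnaissance-oriented report is notable.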

  2. The literature review was a little better than last lab but still reads like a list. Each paragraph, though it is taken in the context of the lab and maybe the article before it, is still treated almost wholly individually. Each article’s paragraph sounds more like a summary. The in-text citations are a little backwards; they’re supposed to be (Author, Year, Page). The handling of McDermott’s paper on penetration testing is a little light; his model of attack trees is important when assessing risk from a defensive perspective and planning an attack from an offensive perspective. Choo’s article on creating an automated red teaming framework describes a computer system for running simulated, automated attacks, and the literature review mentions that this is what is also being attempted in the lab assignment. Automation hasn’t yet been a part of the class; I believe that we’re instead working towards a model that emphasizes cognition and perception of results rather than an automated approach. This is what allows a penetration tester to test the parts of the McCumber cube that are not technology, that involve people and processes.
    The methodologies section is seriously weak. This lab was the first one that allowed us to begin using the tools found in the first lab. I was hoping to see a methodology about selecting various tools from the previous lab that fit into the active reconnaissance role and pitting them against the virtual lab environment that was created. The first paragraph of the methodologies only describes matching the active reconnaissance tools with the OSI layers that they exploit.
    The findings section also lacks depth. Instead of a description of the tools that were used and how they operated in the test environment, there is only the table with the tools and their corresponding OSI layers and McCumber cube positions. The list of tools contains a lot of tools that are well categorized; the only ones I would seriously question are the layer zero tools, which are more objects that would be the targets of exploit tools rather than exploit tools themselves. Part two’s table is very confusing. According to the methodologies this table was going to align the OSI, TCP/IP, and SCADA protocols. The table looks like several different SCADA protocols were all combined into one big table, which makes it difficult to figure out which protocol is being discussed. The descriptions of the various SCADA protocols were much more helpful than the table and added some good detail to the breakdown of each protocol’s components and why they were categorized that way.

    The issues section is interesting. Each “issue” that is mentioned was one of the parts of the assignment. I suppose that it’s good that parts of the assignment were difficult, but more specific problems would’ve been more constructive for those attempting to recreate this lab. Finally, the conclusions are a little weak, simply restatements of some of the findings. One thing that’s missing from this lab report is any mention of anti-forensics methods or any tools to obscure the source of the reconnaissance traffic.

  3. The group starts off with an abstract that describes both parts of the lab. The group did miss a couple of things in the abstract, though. First, they did not mention the literature that they had to read and review at the beginning of the lab. Second, the group did not mention describing each of the layers of the SCADA protocols at the end of the lab. Next the group did a literature review on the eight articles that were assigned with this lab. The group recaps what was done in the first lab and how that lab ties into this lab. I do not think the group needed to do the recap in this part of the lab report; it did not seem to fit. Then the group starts to talk about the eight articles given with this lab. They do a decent job at explaining what each article is talking about and how it pertains to the lab, but they lacked in other parts of the literature review. In the literature reviews the group did not mention the research question, the research method, the theme or topic of the article, or any type of supporting data, and they mentioned only a couple of errors and omissions. I believe that the group could have done a better job in the literature review by cutting back on the description of the articles and focusing more on the research and meaning behind them. Next the group wrote up a methodology. They described what they did in each part of the lab and gave a very brief description of each part. Next were the tables that they had constructed in each part of the lab. The first table compared each of the active reconnaissance tools they had researched to the OSI 7 layer model and placed each one in McCumber’s cube. The table showed that there was a large number of tools in the application and network layers and a good number of tools in the presentation and session layers. The other layers did not have that many tools in them. The next table showed the TCP/IP model and the SCADA protocols aligned to the OSI 7 layer model.
The table was a little difficult to follow because you could not compare all the protocols to each other that easily, since the protocols were on their own tables. Next the group talked about the TCP/IP model and described each of its layers. The group does a nice job of comparing the TCP/IP model and the OSI 7 layer model. They argue that the TCP/IP model should only have four layers, using the argument that it is the most widely accepted of the models. Next the group discusses the SCADA protocols. They start off with a series of MODBUS protocols. The group does a good job at describing what the MODBUS protocols are used for and what they are, but they lack in the details. In the description of each of the layers of the protocols they only mention where the layer of that MODBUS protocol lines up with the OSI 7 layer model. The group does the same with the DNP3 set of protocols as they did with the MODBUS set. Next the group goes into issues they had with this lab. In this section they do not mention the issues that they had with the lab as much as what they did at each step of the lab. Lastly, the group stated the conclusion. In the conclusion they briefly mentioned the researching of the active tools, then go into explaining why they argued for keeping the four layer TCP/IP model rather than five layers.

  4. There were a number of good things noted about this lab write-up. I thought the literature review to be fairly well done, with a nice synthesis of the important points, and a tie-in to the lab exercise attempted for the majority of the articles (although I think the term ‘cop’ was poorly chosen). I also thought the active reconnaissance tool table to be of a nice format, with a large number of tools included (with caveats, noted later). The discussion on TCP/IP was good, with some interesting discussion of the Department of Defense model versus the OSI approach. Also, for the most part, the SCADA protocols presented were discussed at length. In addition, a good number of external references were used to support the discussion on TCP/IP and SCADA.
    A number of flaws were noted, however, some fairly serious, others of less concern. Foremost, a discussion of anonymization techniques was conspicuously absent. While it was noted in the ‘abstract’ section that the team researched this topic, no information was presented on the results: if anonymization tools are present in the tool chart, there is no indication that specific tools are designated to this role. Closely related to the prior problem, the team did not mention any criteria used to classify a tool as an ‘active reconnaissance’ tool. Having researched this area also, I am fully aware of the problems with discerning if a tool falls within this particular domain. A definition of ‘what’ you consider an ‘active reconnaissance’ tool to be is not only helpful, it is absolutely necessary within the scope of this exercise.
    Since no definition of any of the criteria used to classify the tools into the chart is given, it is even more perplexing to discover such tools as ‘googrape’ and ‘gooscan’ included in the ‘active’ tools chart. In reality, these tools do not engage the target network at all; they rely on public data indexed by the Google search engine. I would ask: if tools for digging up public data (equivalent to looking in a phone book or checking in the public library for periodical information) are considered ‘active’ reconnaissance tools, what would you consider to be passive? I also noticed such tools as ‘whois’ and a ‘John the Ripper’ variant included in this chart. The ‘whois’ addition suffers from the absence of criteria also: it might be possible to justify this one (it may engage the target network), but without any explanation as to why it is included, I believe it to be in error. Certainly, ‘John the Ripper’ is much harder to justify, as it works ‘offline’ to crack captured data files; to my knowledge there is no way to use ‘John’ to even connect to a network. This is most certainly an error caused by not being aware of the capabilities of a tool. There are many more inclusions that I would term ‘dubious’: many of the Google-based tools I would consider passive (and therefore categorically wrong), nearly all of the ‘*-crack’ variants I would consider misplaced, and many of the DNS tools I find extremely questionable without some justification presented. I would also consider a DNS tool in Layer 3 to be out of place (although exceptions exist; see team five’s review), as DNS is a layer seven protocol: it may ‘resolve’ a host to an IP address, but this comes back as payload information inside TCP/UDP encapsulation. I also was unaware that layer six of the OSI model is the ‘Penetration’ layer. Finally, I would question the inclusion of RTUs and PLCs as ‘active reconnaissance’ tools: how would you propose to use these devices in this role?
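    The point that DNS answers come back as payload inside TCP/UDP encapsulation can be illustrated with a short sketch in Python (standard library only; the hostname is a placeholder). Everything built here is layer seven data; the UDP and IP headers that would actually carry it are added by the lower layers:

```python
import struct

def encode_qname(hostname: str) -> bytes:
    """Encode a hostname as DNS labels: length-prefixed segments, NUL-terminated."""
    out = b""
    for label in hostname.split("."):
        out += bytes([len(label)]) + label.encode("ascii")
    return out + b"\x00"

def build_dns_query(hostname: str, txid: int = 0x1234) -> bytes:
    """Build a standard A-record query. This whole byte string is
    application-layer payload; nothing in it is a layer 3 construct."""
    header = struct.pack(">HHHHHH", txid, 0x0100, 1, 0, 0, 0)  # RD set, 1 question
    question = encode_qname(hostname) + struct.pack(">HH", 1, 1)  # QTYPE=A, QCLASS=IN
    return header + question

payload = build_dns_query("example.com")
```

The resolved IP address arrives the same way, as bytes in the answer section of a UDP datagram, which is why placing a DNS tool at layer 3 is hard to defend.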
    A few more minor flaws proved distracting while reading the document. A number of the headings for the SCADA protocols appeared to be missing, which marred an otherwise easy-to-follow layout in the rest of the report. The general ‘MODBUS’ heading makes some sense, but the single ‘DNP3’ heading with DeviceNet falling under it confused me a bit: did you mean to imply DeviceNet is a subset of DNP3? If so, I am certain that this is wrong. I also notice that DeviceNet is not included in the SCADA stack charts. Finally, the SCADA charts were just downright confusing. It would have been better if the SCADA network stack tables were detached from each other. Also, did empty areas in the charts imply continuous layers, or areas not addressed by the protocol? I could find no consistent pattern in this area (i.e. application listed multiple times under TCP/IP, but not under MODBUS TCP). Neglecting organizational details like these can render a reasonably complex report impossible to decipher.

  5. The abstract is strong; they clearly identify everything that is going to happen in this lab. There are a few grammatical errors right at the beginning, such as “This is analogy is about…” Bishop’s analogy for penetration testing is good for beginners and non-technical people to understand. However, there is more to the article than just stealing a car. He does not have a research question, but he does talk about penetration testing, not how to do it. McDermott, on the other hand, is more technical in his paper and does provide more detailed information about penetration testing. All six of the steps McDermott talks about are important, not just the first two, which are defining goals and performing a background study.
    Matt Bishop’s second article, Security Plan for Red Teaming, is about planning, which is a great idea when doing just about anything. As with the other article, Bishop does not use supporting resources. He does seem to reference his own knowledge as if he were an expert in the field. If he is an expert, then chances are other experts share his ideas and can reference each other for support.
    The team does a good job relating back to the lab from the reading. In the methods the group talked about gathering active reconnaissance tools and putting them in a chart fitting the OSI 7 layer model and the McCumber Cube. They did this, and it was nicely charted, which made it easy to read. This one is better than the previous chart from lab 1. The team then goes on to talk about the Department of Defense’s TCP/IP model, which is also accepted by Cisco. They break down the model’s layers: Application, Host-to-Host, Internet, and Network Access. The host-to-host description is a little slim. They mention figure 1.3 three times and figure 1.2 as well, but I could not locate in the paper any table or chart labeled figure 1.3 or 1.2.
    The team goes on to talk about SCADA and the SCADA protocols, which are MODBUS, MODBUS+, MODBUS TCP, DNP3, and DeviceNet.

  6. Starting with the abstract, the group says “A comparison could be made…” but I’m uncertain what it is you’re comparing and whether you actually ever make the comparison. Do or do not. There is no could.
    In the literature review, the group tells us that the point of this lab is to see where active recon tools can be used against SCADA. Are you sure that’s the point? I got something slightly different. The group goes on to tell us that “the first lab leads into the second lab” and that “the labs teach penetration testing”. These are obvious points to the intended audience and should not be included. It makes the paper look like the group is more concerned about word count than substance. When discussing JP McDermott’s article, the group equates the lab assignments to the author’s penetration testing planning steps. I don’t think what we’re doing equates to defining goals and performing a background study. It’s a long reach to get there. The group criticizes Matt Bishop’s lack of source material for Security Plan for Red Teaming. Is this a scholarly article per se or something else? Is it ok for the author to give his opinion? I thoroughly disagree with the group’s assessment of both Choo et al. and Godefroid. The papers focused on automating very different things. They are not similar to what we are doing in the labs because of the automation aspect. Still discussing Godefroid, the group states that some of the algorithms “looks like complex formulas and codes”. Isn’t that an algorithm by definition? I don’t understand what Hafle’s paper had to do with Godefroid’s. Is it that they both had colors in the title? Godefroid discusses an automation process for software testing. Hafle talks about different modes of testing. Neither actually talks about ethics. In general, the literature review attempted to discuss the articles and (often erroneously) tie them to the labs, but the evaluative process was missing. Is the article good? Why? I find it hard to believe that the group actually read all of the assigned material. I suggest having someone review for grammatical errors and awkward writing style. For example, Matt Bishop is a person, not an article. The group refers to him as an article repeatedly.
    In the methods section, the table format is very hard to read. I find some of the tool placement suspect; Ettercap, for example. Can one tool be used to attack more than one layer? One set of McCumber coordinates? Your team mentions tools for anonymity in the abstract but never discusses them.
    Discussing the TCP/IP model, the group had a nice find with Stallings’s reasoning. Unfortunately, I can’t verify it because no page number is listed. What do people other than Stallings think? Why do you think we did this exercise? Your answer about the physical layer is WEAK. The group agrees because most people say so?
    In discussing SCADA, the group lists a very small set of applications. Is that all SCADA is used for? Where did you get your extensive knowledge of MODBUS? There wasn’t much cited for this section, and I had a different understanding of the data flow. In the section about DNP3, the group says that the model does not follow OSI at all but is based on an IEC standard. Where do you think the IEC got the idea for the protocol? As far as DeviceNet is concerned, does CIP exactly adhere to the top three layers of the OSI model? Are those the only three SCADA protocol stacks?
    The group’s issues section sounds like the lab objectives. Did you have a hard time with the lab in general, or are there specific issues you had in performing the given tasks?
    The group’s conclusion is painfully weak. What did you learn? Was the lab helpful at all? Were your findings in line with what you expected?

  7. Team 1’s lab started with their abstract, which describes what will be done within the lab. They then go on to the literature review, where they did a good job of leading from one article to the other, but there was not a lot of conflict and conversation arguing different points, such as one author holding one view and another author holding a different view. How do we know which one is right? What facts are in place, or are they just two different methods with one being preferred over the other? This would give the literature review more depth and raise questions for the reader. Getting people actively involved in a lab or paper will spark more thought, and something may even come from those questions and continue a chain of new ideas. Then the team went into their methodologies and what they did within the virtual environment. This part was a little vague in describing which tools they used for the testing and why. It is important to let the readers know why something is being used instead of another tool and for what reason. Sometimes it can be hard for the people working on the lab to convey the information into a document explaining what they did. But being descriptive will help in the long run, serving as evidence and creating a backbone for the results given. They then go on and present their tables. The first table is good and can be followed well, except for a minor error at the 8th layer of the table. The second table was hard to follow. It could have been broken down into separate tables, making it easier to display across WordPress. Then the team explains the different protocols and does a good job breaking them apart and defining each of them, some better than others. One thing that I would like to see is an example of what they are used for and what makes them a target. The team then goes on and explains their issues and the problems they had finding active reconnaissance tools to use.
Overall, this was the biggest issue across all the groups: finding tools that actively perform reconnaissance. This raises the question: is the more effective way to attack a system through a more passive approach, collecting data first and then attacking the targeted system? They then go on to their conclusion and give their thoughts on where things lie and some of their findings. The conclusion needed to be worked on a little bit more, and some of the information could have been put into the findings section.

  8. I reviewed the lab report for team 1 and found a few errors. For example, I believe “any king of penetration testing” was supposed to be “any kind of penetration testing”. In another sentence they wrote, “other had made models”. Should this be “others have made models”? There were some misspellings, such as “Appication” instead of “application”. I do think they made a good point that the low voltage from the SCADA network would not likely power the device and therefore there must be a protocol stack on the receiving end to initiate the high voltage required to run the device.
    Unfortunately, due to my work schedule the past few days I was unable to furnish a more thorough review.

  9. The abstract gave a good overview of what was to occur in the lab. The active reconnaissance tools, though, were not meant to provide anonymity, as this would be the job of anti-forensic tools. I was not sure what the group was getting at when they stated “A comparison could be made to see what of the past laboratory assignment table and the active reconnaissance tools.” I assume it meant that some of the active reconnaissance tools could be extracted from the table from the previous lab.

    In the literature review section, the group did a nice job tying this lab to the previous lab and connecting both labs to the literature review by explaining the significance of penetration testing. For the literature review section, I had noticed a few discrepancies. The article Grammar-based Whitebox Fuzzing was almost completely left out of the literature review; the supporting data included only an extreme overview, distributed throughout the literature review section, that briefly mentions the term fuzzing and that code was used for penetration testing. What relationship did this article on fuzzing have to some of the other articles? How did fuzzing relate to the lab? The supporting information for the article Three Different Shades of Ethical Hacking: Black, White and Gray seemed way too brief, because besides covering ethics, the article explained different types of penetration testing techniques based on their coded color. I did not understand how the group associated good and evil with the different classifications of penetration testing when the team stated that “While all that the group will be learning in can be used for “evil” it can also be used for good.” When the group stated that “The entire purpose is to learn how to properly test our systems and how to protect them”, was this a reference to the article or to the lab? There was not a transition statement to correlate the article to the lab.

    The method section gave an overview for the remainder of the lab, explained how the active reconnaissance tools were placed into the table, and explained the alignment of the TCP/IP model. The method section did not mention the alignment of the SCADA protocols to the OSI and TCP/IP models.

    In the active reconnaissance table there were a few tools or methods that seemed more passive than active. Shoulder Surfing is more passive than active. Wireshark would also be more passive than active.
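    The distinction this draws comes down to whether the technique transmits packets toward the target. A minimal sketch in Python of an active probe, the smallest unit of a TCP connect scan (the host address in the usage comment is a placeholder from the documentation range, not a real target):

```python
import socket

def tcp_probe(host: str, port: int, timeout: float = 1.0) -> bool:
    """Active reconnaissance in miniature: attempt a TCP connection, which
    sends a SYN toward the target. A passive tool like Wireshark only
    listens and never generates traffic of its own."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0  # True if the port answered

# open_ports = [p for p in (22, 80, 443) if tcp_probe("192.0.2.10", p)]
```

By this test, shoulder surfing and Wireshark both land firmly on the passive side: neither ever puts a packet on the wire toward the target.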

    In the H2H Transport layer subsection, group 1 stated “Figure 1.3 indicates that the TCP/IP H2H Transport layer satisfies the characteristics of the OSI Transport layer. Also, Figure 1.3 includes the debatable inclusion of OSI session layer characteristics.” Where is this figure 1.3 that the group mentions?

    For the second part of part 2, the SCADA table was somewhat hard to follow and appeared to have formatting issues.

    The group answered the question does the TCP/IP model have four or five layers? They concluded that the TCP/IP model should only have four layers because the Physical layer is implicitly there within the Network layer.
