April 16, 2025

10 thoughts on “TECH 581 W Computer Network Operations: Laboratory 2, Team 4”

  1. Team four has posted a lab that is much the same as their first lab. The MSO issues seem to have disappeared, which is an improvement over lab one. However, due to my own inability to remove MSO issues I am hardly in a position to criticize that aspect. I did notice how long it took team four to update their lab after the initial issue, which does call into question team four’s ability to present a well formatted lab on time. Beyond that, team four presents an abstract that meets the length requirement of the syllabus, but that abstract reads as if it were copied and pasted from the lab design document rather than being an original idea; it does, however, cover the steps of the lab as they will be completed. Team four’s literature review follows the same format as in their first lab, as do most other literature reviews submitted by most teams for both labs. The literature review is not cohesive; it is a list of reviewed readings with APA citations and answers to the questions listed in the syllabus. This is not a scholarly literature review, and it needs to be improved in the future. The methods section is almost nonexistent and does not meet the requirement of being a scholarly explanation of the strategy and techniques used to complete the lab. After the methods section there is no listing of findings and answers; the lab just lists the sections of the lab design document.

    The active recon table is complete and easy to read and understand. The technology column does not fit with the tables presented in other labs, and I question its usefulness. I also question the placement of Quid Pro Quo on the table at all; it does not make sense to me. I also question the placement of HTTP based tools at layer 6; HTTP is an application layer protocol. Finally, I question the placement of SIP tools at layer one. SIP is by no means an electrical or light based transmission medium, but rather a Voice over IP control protocol running at layer 7 (see the sketch below). Team four took a different stance on the fifth layer of the TCP/IP model, adding in that layer. I disagree; until the DoD changes the model, there are four layers.

    The SCADA table as presented by team four is better than the one presented by team one, and less confusing than the one presented by team three. However, that does not make it correct. I still believe that each SCADA protocol should have been broken into an individual table, each comparing the selected protocol to the OSI model. Team four then breaks down each layer of each SCADA protocol, but each of these explanations seems to be a direct copy and paste from the specified sources rather than providing any synthesis on the topics. I do agree with their conclusion that active recon does seem to favor layers seven and three. I do not agree with them on active recon favoring layer six; this seems to be based on faulty placement of protocols in the exploit table.
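
    As a quick illustration of my point about SIP, here is a minimal sketch (the target address, ports, and tags are hypothetical, not taken from team four’s lab) of a SIP OPTIONS probe, a common active reconnaissance check for SIP endpoints. Everything SIP-specific below is plain application-layer text handed to a UDP socket, which is why these tools belong at layer 7 rather than layer 1.

        import socket

        target = ("192.0.2.10", 5060)          # hypothetical lab host, standard SIP port
        local_ip, local_port = "192.0.2.1", 5061

        # The OPTIONS request is ordinary text; SIP itself never touches the wire directly.
        request = (
            "OPTIONS sip:%s SIP/2.0\r\n"
            "Via: SIP/2.0/UDP %s:%d;branch=z9hG4bK-probe1\r\n"
            "Max-Forwards: 70\r\n"
            "From: <sip:probe@%s>;tag=1234\r\n"
            "To: <sip:%s>\r\n"
            "Call-ID: probe-1@lab\r\n"
            "CSeq: 1 OPTIONS\r\n"
            "Contact: <sip:probe@%s:%d>\r\n"
            "Accept: application/sdp\r\n"
            "Content-Length: 0\r\n\r\n"
        ) % (target[0], local_ip, local_port, local_ip, target[0], local_ip, local_port)

        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("", local_port))             # listen where the Via header says we are
        sock.settimeout(3)
        sock.sendto(request.encode(), target)
        try:
            reply, _ = sock.recvfrom(4096)
            print(reply.decode(errors="replace").splitlines()[0])   # e.g. "SIP/2.0 200 OK"
        except socket.timeout:
            print("no SIP response")
        finally:
            sock.close()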

  2. The group kept saying “student team”. Try to avoid saying that; instead say “group”. The literature review reads very much like a list. This group needs to write a more cohesive literature review. The group wrote the literature review as if they were just answering the required parts, without really comparing the papers to each other or taking a stance on whether they agree with what the authors have to say. Some parts of the group’s literature review did not cite anything from the papers, yet those papers were still put into the bibliography. Not all page numbers were cited, and as stated before, the group only answered some of the required questions. It was obvious that different people wrote different parts of the literature review. Time needs to be taken to make sure that the literature review is cohesive and flows together smoothly.
    The group keeps stating that the tools will be placed into the OSI 7 layer model, yet the tools were put into the OSI 9 layer model. To me, this table looks like the group just took their table from lab one and deleted some rows. I do not agree with many of the tools the group placed in layer 8. I do not think that getting someone drunk or high would be an active reconnaissance tool, nor would many of the other tools that were placed in layer 8. I do not agree with fire alarms being an active reconnaissance tool. I would like to have seen the tables split apart for the different models being compared to each other. The chart was very long and hard to compare to the group’s descriptions; if the chart were separated, the descriptions could have followed the separated tables.

    I find it very hard to believe that the group had no issues with this lab, considering the questions that the group asked in a discussion. This group had no issues or problems in their last lab report either, so they must be doing something different from the other groups. Any problems or questions that the group had during the entire process should be put into this section and discussed thoroughly. The tables should be placed into the results section. The methodology section should be the steps of the process for performing the lab; this would include any research done or any items performed in the virtual environment. The tables made would be the results of that research, which puts them in the results section. The group’s results section sounded like a summary or abstract of the lab exercise. The results section should contain the results, answers to any questions, and any discussion the group wants to include. The conclusion needs to be more in depth, not a summary of the lab.

  3. The abstract simply restates the requirements and objectives for the lab. While this isn’t incorrect, it could use a little more cohesion and synthesis with the work that was actually performed in the lab.
    The literature review is simply a list of the assigned readings with a lengthy summary. While none of the articles are compared and contrasted against each other in the broader context of the topic at hand, the lab exercises are taken into consideration. Some of the tie-ins to the lab exercises were good starting points for further discussion and research but ultimately fell short.
    The methodologies section discusses the different parts of the lab but lacks detail. One part of the lab exercises was to test the active reconnaissance tools in our lab environment after creating the table, but no mention is made of this at all in the methodologies section. The latter parts of the methodologies section discuss the alignment of the OSI model and the SCADA protocols, but there is no mention of how this is going to be done. It would aid the reader to know how, specifically, this was being done.
    The discussion on why the physical fifth layer should be present is probably the most detailed of all of the group labs, but it missed mention of two of the main proponents of the five layer model, Comer and Stallings. The conclusion that the data link layer and physical layer should be separated implies that they would otherwise be in the same layer. That alone should have some discussion around it as to why the data link layer doesn’t serve this function. RFC 1122 (http://tools.ietf.org/html/rfc1122) has some good information on the layers.
    The treatment of the SCADA protocols in the table was difficult to read. Instead of putting all of the protocols in the same table, it would have been easier to read with each protocol treated separately. Putting the SCADA protocols together does show the relationship between the components and possibly some avenues of attack or, at least, areas that could be focused on for attack. However, an organization would likely standardize on a single protocol for its SCADA infrastructure, so combining them wouldn’t have an immediate advantage for the would-be attacker. The descriptions of the various protocols are unnecessary, as most are quotations from the vendors. For brevity’s sake it would have been easier to just include links for further information on the protocols.
    The results section mentions valid findings on the concentration of tools in the various layers of the OSI model but still lacks any mention of testing the tools in the test environment, which was a major part of the lab exercises. This section is also the first to mention the anonymity principle that was supposed to be considered in the first section of the lab. The conclusions also lack detail and synthesis between the tasks performed. Simply restating what was done doesn’t suffice; the findings and results should be reviewed and discussed.

  4. Team 4 did a better job this time using their abstract to describe what they were going to do in Lab 2. One thing I noticed that seemed odd is that they referred to themselves as “the student team”; I think “group” would have been fine. The reviews of their articles were more comprehensive than in their last lit review; however, improved writing would make them easier to follow. I’m surprised they didn’t have any issues with this lab. Once again I found their tables to be confusing; I am not sure what the technology column was supposed to explain. In my readings I have never heard the OSI 7 layer model referred to as an OSI 9 layer model. The formatting of the tables in part 2a was very confusing. The methods section should be the steps of the process for performing the lab; this would include any research done or any items performed in the virtual environment. Their methods section read more like a summary of the abstract. Just like the methods section, the results section read more like a summary of their process. The results section should contain the results, answers to any questions, and any discussion items they may have. The conclusion was very light and should have contained more of their analysis of the overall process; it seemed to be a summary of the lab.

  5. Overall the formatting was done better this time around. Starting with the abstract, I found a couple of points that are repeated; it is almost as if the first paragraph was rephrased. In the future, try to make sure each point is clear without being redundant. I was still able to understand what the lab was about and was ready to see what team 4 had for their literature review.

    Upon reading the literature review I still found it broken into different pieces. From our discussions of what a literature review actually is, my understanding is that it is a cohesive synthesis of all the literature that was read, taking the subject or subjects of all the pieces and discussing them together to find a common area, then discussing points of each piece and including different viewpoints that may contrast with one piece but agree with another. Looking more closely at the group’s reviews, I noticed a comparison of the literature to the actual lab or the class itself in only some of the reviews.

    Next the group goes on to discuss their methodologies and what is going to happen within each section. The part that I missed was the comparison between the TCP/IP and OSI models. I did notice that they put in a table of attacks just like in the first lab, but I did not see the comparison of the TCP/IP model with the OSI model. This was the whole first section; yes, they did take the active reconnaissance tools and pair them with the OSI model, but they left out a big part of the TCP/IP model. Next the team goes on to compare the SCADA protocols, and here they do include the TCP/IP model that was needed for the first part of the lab. One thing that can be questioned is where the team placed the layers for each of the SCADA protocols and what the reasoning behind those placements was. Several of the teams cover at least three of the same protocols but place items differently. Just understanding why they put the layers where they did would be a help, and would give the reader a better understanding of their perspective and why the tables should be formed in this way. The team then goes into detail and describes what each protocol is and what its purpose is.

    After the methodologies the group goes into their findings. It seems that the group was learning a lot, and then it kind of slowed down when it came to the SCADA protocols. Lastly they concluded the lab very briefly. This seemed really short for a conclusion, and some of the information could have been used in their findings. In the end the team did do what was needed, but next time they will be able to tweak their lit reviews to be more cohesive and have a stronger ending that will not fizzle out.

  6. I thought this overall to be a well thought out lab write up. The literature review made specific mention of relevance to the lab exercise for each article. The tools table was cleanly laid out, with discernible consistency in the tool layer classifications. The SCADA stack tables were very nice and easy to understand: a real strong point in this write up. Also, the discussion of SCADA protocols was ‘very’ detailed. Finally, it was nice to find a discussion of some of the logic used to classify the ‘active reconnaissance’ tools in the ‘results’ section. I believe this is also an area where this team stood out, as some other teams did not detail this process in any way.

    However, a number of obvious deficiencies existed within the scope of this write up. Conspicuously absent is any mention of anonymization techniques. While I do believe the issue was examined, due to the inclusion of such anti-forensic tools as ‘Clearlogs’ in the tool chart, I find no discussion of the concept in general. As noted in other lab teams’ write ups, some of these techniques are general concepts, and not necessarily addressed fully by a tool listing alone. This lab would benefit from a short discussion in the ‘results’ section of what makes a tool an anonymization means, or at the very least some sort of descriptive label in the tools table would be in order. I also noticed the somewhat confusing ‘Technology’ column remained from last week’s tool table. I would reiterate that while the additional column for descriptive purposes is a nice idea, the label is a poor choice due to its collision with the McCumber cube ‘namespace.’

    As noted with other groups, I take exception to the inclusion of offline password cracking tools in a list for ‘active reconnaissance’ application. Password cracking is in nearly all cases done on previously collected data, mostly ‘sniffed’ off of the network with passive means. I cannot reconcile this type of program with any ‘active’ role, even by a stretch of the imagination; the sketch below illustrates the point. Of course, a logically constructed discussion of ‘why’ you chose to include these types of programs, which is not present, might provide reason for me to reconsider my conception of ‘active’.
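
    A minimal sketch of what I mean (the hash and the wordlist are hypothetical, not taken from the lab): a dictionary attack against an already captured MD5 hash. Nothing below opens a socket or touches the target network, which is why I place this activity outside ‘active’ reconnaissance.

        import hashlib

        captured_hash = "5f4dcc3b5aa765d61d8327deb882cf99"      # hypothetical hash captured earlier (MD5 of "password")
        wordlist = ["letmein", "qwerty", "password", "hunter2"]  # hypothetical wordlist

        # Purely offline work: hash each candidate and compare against the capture.
        for candidate in wordlist:
            if hashlib.md5(candidate.encode()).hexdigest() == captured_hash:
                print("recovered:", candidate)
                break
        else:
            print("no match in wordlist")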

    Finally, a few problems were noticeable within the SCADA discussion. While the write up in this area was extensive, it seemed to me to be more of a random ‘information dump’ exposition than a concise description. For instance, in the discussion of DeviceNet, why was it mentioned that “a high pulse… represent[s] a logical 0 and a low pulse … a logical 1” (and does it really belong under a discussion of the ‘link’ layer proper)? If the goal was to compare physical layer signals, Ethernet uses an interesting system of Manchester encoding which is ‘far’ more intriguing than a simple pulse system, yet this is not mentioned (see the sketch below). Additionally, the inclusion of the bit fields of the DeviceNet protocol seemed spurious, for reasons similar to the prior point. Furthermore, I could not locate a discussion of SCADA vulnerabilities, which was one of the primary concerns of this lab exercise. I would recommend condensing the information in this section, and then addressing points which pertain in a more direct way to SCADA security issues.
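
    To make the contrast concrete, here is a minimal sketch (my own illustration, not taken from the lab) of the two signalling styles: the simple one-level-per-bit scheme described for DeviceNet versus the Manchester encoding used by classic 10 Mb/s Ethernet, where, in the IEEE 802.3 convention, a 1 is a low-to-high transition in the middle of the bit period and a 0 is high-to-low, so the receiver can recover the clock from the signal itself.

        def pulse_encode(bits):
            # One level per bit period; per the write up's description, a high level
            # represents a logical 0 and a low level a logical 1.
            return [0 if b else 1 for b in bits]

        def manchester_encode(bits):
            # Two half-bit levels per bit; the guaranteed mid-bit transition carries the clock.
            out = []
            for b in bits:
                out.extend((0, 1) if b else (1, 0))
            return out

        data = [1, 0, 1, 1, 0]
        print("pulse:     ", pulse_encode(data))
        print("manchester:", manchester_encode(data))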

  7. Team four’s offering is much improved over last week’s. There are some important general issues that need to be addressed up front. A literature review is NOT a list of articles. Relate the articles to each other and to the task at hand in narrative form. Evaluate the work. Is it good, garbage, useful, or a waste of time? Why? Even though you are directly discussing the work, citations are still necessary. Please have a third party proofread your work.

    There are some specific points in the literature that need to be addressed. When discussing the article About Penetration Testing, the group states that because the author cites his own previous work there is a lack of research. However, this may not be the case. What research did the author do for those articles? Why duplicate the effort? In Automated Red Teaming: A Proposed Framework for Military Application the group says, “This article could be used in our lab environment to show how an automated simulation could be set up that would use the red teaming concept to expose vulnerabilities and weaknesses in a network or computer and then use the results to determine what would be the best plan to reduce those vulnerabilities or weaknesses.” But isn’t that the point of red teaming in general? While discussing Geigick and Williams’ article, the group states, “In this selection of vulnerabilities, I believe that the writers greatly reduced their scope on this study of vulnerabilities”. Can you clarify this for me? Does this paper lend credibility to what we’re doing? Would something like what we’ve done with the table be useful in Godefroid’s process? The group quotes McDermott as saying that attacks cannot be patterned. Is he right? You go on to say that his abstract is too short, but this is not really a critical fault.

    The group’s methods and results sections were thin. In your results section, you call for a five layer TCP/IP model, but your findings section doesn’t strongly support it. Do you discuss the TCP/IP stack at all beyond this? You mention anonymity tools in the abstract but never discuss them again. You never really explain the reconciliation of the SCADA protocols to the OSI model; you just tell me about them. The question asked was not why the physical layer should be separate; it was, does it exist? I do think the group’s Kinetic layer active reconnaissance tools are interesting. They show some thought. I would have liked to see that developed more. In the DeviceNet section, you talk about CAN. What is it? Is EtherNet/IP really any different from TCP/IP?

    In the issues section the group states that there were no issues at all. I don’t believe it. The conclusion doesn’t tell me what the value of the lab was. Did you get anything out of it?

  8. I think that group 4’s write-up for lab 2 was good overall. The abstract for this lab was adequate in terms of length, but the content seems copied and pasted. Each sentence started with or contained either “the team” or “the student team”; the abstract might as well have had bullet points. The literature review was good. Group 4 answered almost all of the required questions. The group did discuss how each reading related to the lab, but did not discuss whether or not they agreed with each reading. All of the citing for the literature review was done well, and the page numbers for the references were also included. However, I’m still a firm believer that you can cite too much; I do not think you should cite a source eight times in one paragraph, even if it is a semi-large paragraph. The table containing the penetration testing tools was very good in many areas because they added to the many tools and techniques that they had included in the first lab. However, I still think that many of the tools and techniques included are not appropriate or are already covered by existing entries. I think the group could have gone into more depth about why they chose the 5-layer TCP/IP model. What about the DoD model? How is that not correct? Also, do these layers match up exactly with the OSI model? Or is it fuzzy where layers like the session and transport layers meet? When dealing with SCADA, what about the Kinetic layer? The issues and problems section could have had a lot more depth. The conclusion for this lab was weak at best.

  9. I reviewed the lab report for Team 4 and found that I agree with most of it. I do think that they needed to take the time to proofread the document before submitting it. They would also have done well to run a spell check and grammar check.
    Unfortunately, due to my work schedule the past few days I was unable to furnish a more thorough review.

  10. In Team 4’s OSI layer model exploit table, the first row is “Get them drunk or high”; perhaps a better choice of words would have suited this row. The following two rows, “Plant an insider” and “Espionage”, are pretty much the same thing; they even fall at the same McCumber cube coordinates. “Scopolamine” and “Sodium Pentothal” would most likely both count as “Torture” and have the same McCumber cube coordinates. Interviewing terminated employees is a great idea. This one might be odd, but can you give an example of how damaging computer equipment to discover contractor information would be that useful? Even if you discover the contractor information, are you then going to bribe a possible employee, who may or may not be the responder to that incident? The chart is structured very well, which makes it easier to read and understand, unlike the chart from lab 1. Overall the reviews of the readings were good; they offered more detail than in the first lab.
