Document: Guidelines for IP Flow Information eXport (IPFIX) Testing
Reviewer: Joel M. Halpern
Review Date: 15-Feb-2008
IETF LC End Date: 26-Feb-2008
IESG Telechat date: N/A

Summary: This document needs some additional work before publication as an Informational RFC. I particularly recommend addressing at least the first comment below prior to RFC publication. I would also suggest that the test descriptions need some clarification, as described in the technical comments below, particularly items 5 and 6.

Comments:

Conceptual:

1) While the document is being published as an Informational RFC, the wording of the abstract and introduction makes it seem that this document actually defines conformance to the IPFIX RFCs. The IETF has generally steered carefully clear of defining such conformance. So, while publishing a useful test suite is probably a good idea, I strongly recommend revising the wording of at least the abstract and introduction to make it quite clear that these are not mandatory tests and that they do not define conformance. Related to this, please do not assert (in Section 3) that passing this test suite constitutes conformance to the IPFIX architecture and protocol. (Among other things, passing a test suite proves nothing about architectural conformance.)

Technical:

2) In the terminology section, an Observation Point is defined simply as a place where packets can be observed, and an Observation Domain as a collection of Observation Points. Then, in the middle of the definition of an Observation Domain, it says "In the IPFIX Message it generates...", but up to this point none of the things that have been defined generate IPFIX Messages. It is possible that the "it" in the quote is meant to be the Metering Process mentioned in passing earlier in the definition, but the grammar does not lead the reader to that conclusion. Later in the same definition, it begins to appear that an Observation Domain (which is a collection of points, not a process or entity) is itself supposed to generate IPFIX Messages, since it is supposed to include a Domain ID in the Messages it generates. This definition of an Observation Domain needs to be reworked to avoid confusing the Domain with the Metering Process that runs in / for / on the Domain.

3) The use of capital "MUST" in Section 3.1 is almost certainly wrong. First, what I think that section is saying is that being able to perform the basic tests correctly is a precondition for being able to perform the further tests successfully. That is a precondition, not a "MUST". Of lesser significance, this document does not provide any description of what it means by "MUST"; we are usually careful about how such language is used in Informational RFCs. I think the meaning would be clearer if the real intent were stated. I suspect that some readers of this review may find my concern here pedantic, but the continual use of MUST in the document really, really bothers me. (I hope the next comment helps explain why it bothers me so much.)

4) The test descriptions then go on to keep using this language. This is a test suite description document; simply state how to run the test. There is no need for "MUST". Section 3 should indicate that the test descriptions describe the preconditions and steps that the tester goes through. Section 3.1 would then begin "The tester creates one Exporting Process and one Collecting Process, configures the Exporting Process to ..."
5) It is not clear what test steps like "The tester ensures that an SCTP association is established." (or worse, the actual text, which reads "the test MUST ensure that an SCTP association is established") are supposed to do. Is this an instruction to the tester to use network management tools or a CLI to verify the connection on both devices? Is it an instruction to perform additional configuration? How does the tester "ensure"? A test suite should tell a tester what steps to undertake and what observations to perform. "Ensure" is neither of those.

5a) To elaborate on this issue: in the middle of the test step about ensuring that Data Records are actually exported, we finally get a testable instruction, to wit, use a packet sniffer and check that the packets are coming by.

6) I believe I understand how a tester would create Templates for the template test. But how is the tester to create Data Sets, particularly Data Sets with specific properties such as the padding in Sections 3.2.3 and 3.2.4? The best conclusion I can come to is that this is a Collector test and that it assumes a packet generator which can generate IPFIX packets. Having such a device in a test setup makes sense, but the test description does not say "configure a packet generator to generate an IPFIX packet with ..." (There are other ways to say this, but there needs to be some description of how testers are expected to create Data Sets; see the sketch at the end of this review for the sort of instruction I mean.)

6a) Related to this, I find reading this document rather odd. I have read many test suites for protocols and for protocol implementations. They generally focus on a Device (or implementation, or entity) Under Test and the framing around that device. This suite appears to be trying to test two interacting devices simultaneously. That is extremely difficult and extremely confusing, particularly because the tester then does not have enough points of control to perform the tests and observe the results meaningfully. It is possible that this combined suite is right for this problem, but if so, a lot of explanation is needed of why it is done that way and how the tester is to accomplish his goals.

Minor:

7) The abstract is worded as if one could not perform interoperability testing without first running the tests in this document. While having run these tests will presumably increase the chances of a successful interoperability test, they are not an inherent requirement for such testing.

8) I would be inclined to lighten up the Motivation section a bit, or even remove it. I don't think we need to explain why test suites are useful. If we really need a Motivation section, it should explain something about why it is particularly complex to test IPFIX implementations (if it is), and thus why the IETF feels it particularly useful to publish a test suite in this case.

9) The definition of Transport Session is actually a definition of various kinds of transport sessions and of how they are identified. Could the definition please start with an actual definition? (I.e., the communication over time used to carry X between Y and Z, or something similar.)

10) As an editorial matter, most testers I have worked with strongly prefer that every step in a test be explicitly separate and named / numbered, so that they can check off each step as they go. So the beginning of 3.1.1 would be:
   i) Create one Exporting Process
   ii) Create one Collecting Process
   iii) Configure the Exporting Process ...
11) It is particularly odd to see a set of Stress/Load tests that simultaneously claim to be measuring conformance and yet do not specify the level of stress / load. Having a description of how to perform load tests is useful, but its relationship to the other tests is confusing. (This is obviously helped once the document no longer claims to be a conformance test.)
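
As a concrete illustration of the kind of instruction comment 6 is asking for, here is a minimal packet-generator sketch (in Python, purely as an example; the draft contains no code) that builds one IPFIX message whose Data Set ends in padding octets and hands it to a Collecting Process under test. The collector address and port, the Template ID, and the choice of Information Elements are my own assumptions, not anything taken from the draft, and UDP is used only to keep the sketch short even though the draft's tests are framed around SCTP.

#!/usr/bin/env python3
"""Hypothetical packet-generator sketch for comment 6: build one IPFIX
message whose Data Set carries explicit padding, then send it to a
Collecting Process under test.  All names below (collector address,
Template ID, Information Elements) are illustrative assumptions."""

import socket
import struct
import time

COLLECTOR = ("192.0.2.1", 4739)    # assumed collector; 4739 is the registered IPFIX port
OBSERVATION_DOMAIN_ID = 1
TEMPLATE_ID = 256                  # lowest value usable as a Data Set ID

def template_set() -> bytes:
    # Template Set (Set ID 2): one Template with two fields,
    # sourceIPv4Address (IE 8, 4 octets) and sourceTransportPort (IE 7, 2 octets).
    record = struct.pack("!HHHHHH", TEMPLATE_ID, 2, 8, 4, 7, 2)
    return struct.pack("!HH", 2, 4 + len(record)) + record

def padded_data_set() -> bytes:
    # One 6-octet Data Record plus 2 octets of zero padding, so the Set
    # ends on a 4-octet boundary -- the sort of Data Set the tests in
    # Sections 3.2.3 / 3.2.4 appear to require the tester to produce.
    record = socket.inet_aton("198.51.100.7") + struct.pack("!H", 49152)
    body = record + b"\x00\x00"
    return struct.pack("!HH", TEMPLATE_ID, 4 + len(body)) + body

def ipfix_message(sequence_number: int) -> bytes:
    sets = template_set() + padded_data_set()
    # Message header: version 10, total length, export time, sequence
    # number, Observation Domain ID.
    header = struct.pack("!HHIII", 10, 16 + len(sets),
                         int(time.time()), sequence_number,
                         OBSERVATION_DOMAIN_ID)
    return header + sets

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(ipfix_message(0), COLLECTOR)
    # The observable step in the sense of comment 5a: run a sniffer
    # (e.g. "tcpdump -n udp port 4739") next to the collector and check
    # that the message arrives and that the collector accepts it.

An instruction written at roughly this level of detail ("configure the packet generator to send one message containing Template X and a Data Set padded to a 4-octet boundary, then observe with a sniffer that the collector receives and accepts it") would give the tester both the steps to undertake and the observation to perform.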