The primary intended users of Section 13, Acquisition of Secure Software, are those involved in software acquisition education and training, as well as standards developers, who need additional knowledge on acquiring secure software. While educators, trainers, and standards developers are the primary users, buyers and suppliers may also find the knowledge in this section useful. The advice in this section offers ideas for anyone who wishes to use the Acquisition of Secure Software section.
14.6.2 About the Sample Language
The notional examples and illustrations that provide sample language are intended to provoke thought. Attorneys have not reviewed them for efficacy. The sample language should be modified and expanded to fit the users’ particular acquisition environment. As an example, the sample statements of work may contain language that may be more appropriate in other sections of a request for proposals or contract as “terms and conditions.” Likewise, any sample “terms or conditions” may be more appropriate in the statement of work.
14.6.3 Software Acquisition Education and Training
In using this section, the educator or trainer should first establish a specific or generic acquisition process description that is best suited to their educational environment. As an example, educators and trainers of US Department of Defense acquisition would use the process established in the 5000 regulation series, while other educators and trainers might use a process identified in a generally accepted IEEE or international standard. In addition, major sub-processes may be embedded within the larger acquisition process, such as the (software) systems engineering process. In this case, the educator or trainer may wish to integrate acquisition within the context of a major sub-process. The educator or trainer may also wish to select the larger software lifecycle process.
Once the process is defined, relevant ideas presented in the Acquisition of Secure Software section can be mapped and integrated into the selected process. Student materials can then be created by expanding the information using the references provided within each section and relating the acquisition of secure software knowledge to the selected process.
In addition to providing a single repository of knowledge on the acquisition of secure software, this section also provides subject matter for expanded research and other scholarly work. Risk-based approaches, pedigree management, and incentives in the acquisition process for providing secure software are examples of areas that need further exploration.
14.6.4 Standards Developers
Standards developers can use the knowledge in this section to establish new or modify existing standards. As an example, standards on recommended practice for software acquisition [IEEE 1062] or software life cycle [IEEE 12207] may be modified to incorporate ideas presented in the Acquisition of Secure Software section.
Buyers and suppliers can use the sample statement of work or other language provided in this section to the extent that it is applicable to their acquisition. However, they should heed the cautions above on using the sample language.
Feedback from readers and users is needed to improve this section and the document as a whole; feedback of all kinds is welcome and will be greatly appreciated.
1See back of title page, opposite this page, for directions on how to find out more about, contact, or join the Working Group.
2 See back of title page for directions on how to find out more about, contact, or join the Working Group.
3 Then Deputy Director for Software Assurance, Information Assurance Directorate, Office of Assistant Secretary of Defense (Networks and Information Integration)
4The February 2005 President’s Information Technology Advisory Committee (PITAC) Report to the President, Cyber Security: A Crisis of Prioritization, identified the top ten areas in need of increased support, including ‘secure software engineering and software assurance’. The findings indicated: Commercial software engineering today lacks the scientific underpinnings and rigorous controls needed to produce high-quality, secure products at acceptable cost. Commonly used software engineering practices permit dangerous errors, such as improper handling of buffer overflows, which enable hundreds of attack programs to compromise millions of computers every year. In the future, the Nation may face even more challenging problems as adversaries – both foreign and domestic – become increasingly sophisticated in their ability to insert malicious code into critical software.
5 Such software must, of course, also be satisfactory in other respects, such as usability and mission support.
6 The National Software Strategy includes educating and fielding the software workforce to improve software trustworthiness (see report at www.cnsoftware.org/nss2report).
7 SWEBOK® is an official service mark of the IEEE
8 By the UK Ministry of Defence and Praxis High Integrity Systems
9 Available at http://www.swebok.org
10 See http://www.cnss.gov/full-index.html
1 The gap in attention to software security in the 1990’s has reputedly led to many articles being submitted for publication today that repeat this early work because the authors are unaware of it.
2 “Vulnerability: Weakness in an information system, system security procedures, internal controls, or implementation that could be exploited or triggered by a threat source.” [NIST FIPS 200]. “A system that allows computer viruses to replicate or unauthorized users to gain access exhibits vulnerabilities.” National Research Council (NRC) Computer Science and Telecommunications Board (CSTB): Cybersecurity Today and Tomorrow: Pay Now or Pay Later. Washington, DC: National Academies Press, 2002.
3 During an earlier period, a study showed that among a set of major vendors, announcing a product vulnerability was followed by an average 0.6 percent fall in stock price, or an $860 million fall in the company’s value. The article, written by Celeste Biever, appears in New Scientist magazine, 25 June 2005, http://www.newscientist.com. Today, however, the market probably has already factored in expectations of these announcements.
This and similar questions have been explored at Workshops on Economics of Information Security since 2002. http://infosecon.net/workshop/index.php.
4 See section 4 for more about this and other legal considerations.
5 ChoicePoint’s stock falling 20 percent in the period after an incident was disclosed shows another potential impact, even though in this incident the losses resulted from deception, not necessarily faulty software.
6 Software safety ensures that the software’s operation remains reliable under accidentally hazardous conditions, while software security ensures its reliability under intentionally hazardous conditions.
7 While this report does not attempt to make the software community cease its overly broad use of the term “maintenance,” in the interest of better usage it generally uses the terms “sustain” or “sustainment” when actions such as adding new functionality are included.
8Several other bodies of knowledge may also benefit from integrating material from this guide, including the Australian Computer Society (ACS) Core Body of Knowledge for Information Technology Professionals, the forthcoming ACM CCS2001 Computer Science Body of Knowledge, the DRM Associates/PD-Trak Solutions New Product Development BOK, and some methodology-specific BOKs, such as SEI's PSP Body of Knowledge Version 1.0, some role-specific BOKs, such as the Quality Assurance Institute's Certified Software Tester Common Body of Knowledge, Certified Software Quality Analyst Common Body of Knowledge, and Certified Software Project Manager Body of Knowledge [PMBOK].
9 For information, though not for software, another security property is closely related to confidentiality and integrity: privacy. Privacy ensures that information about people's identities or secrets is not revealed to anyone who does not have an explicit need to know that information. Privacy also means that only the person to whom the information pertains has the right and ability to authorize changes to or disclosure of that information.
10 In the NIST Special Publication 800-53, Recommended Security Controls for Federal Information Systems, security controls are defined as the management, operational, and technical safeguards or countermeasures prescribed for an information system to protect the confidentiality, integrity, and availability of the system and its information.
11 An emergent property is one emerging only after the system is composed, i.e., a property of the system as a whole.
12Threat: an adversary that is motivated to exploit a system vulnerability and capable of doing so. National Research Council (NRC) Computer Science and Telecommunications Board (CSTB): Cybersecurity Today and Tomorrow: Pay Now or Pay Later. Washington, DC: National Academies Press, 2002. The malicious developer who plants a Trojan horse back door and the hacker who exploits a buffer overflow vulnerability in executing software are two examples of threats to software.
13 In a risk assessment, the likelihood and anticipated frequency of attacks/exploits will also be considered.
1 Recent estimates say the vulnerability-to-exploit window is now approximately 6 days, but this has been changing rapidly.
3 For descriptions of cyber crimes see the US Justice Department at http://www.cybercrime.gov/ccdocs.htm and the ACM Risks Forum archives at http://catless.ncl.ac.uk/Risks
4 Gregg Keizer, “Cleaning Up Data Breach Costs 15x More Than Encryption” (TechWeb, June 9, 2006)
5 CERT/CC, “CERT/CC Overview: Incident and Vulnerability Trends” (7 May, 2003)
6 Paul Stamp, “Increasing Organized Crime Involvement Means More Targeted Attacks” (Forrester, 12 October 2005)
7 Bill Brenner, “Security Without Firewalls: Sensible or Silly?” (SearchSecurity.com, 5 January 2006)
8 Jeremy Kirk, “Password Stealing Trojan Spreads” IDG News Service (PC World, 30 May 2006)
9 Jay Wrolstad, “Trojan Targets Microsoft Word Vulnerability” (Top Tech News, 22 May 2006)
10 Priyanka Pradhan, “New Settlement in Sony BMG Rootkit Case” (CNBC TV 18, 23 May 2006)
11 Sverre H. Huseby, “Common Security Problems in the Code of Dynamic Web Applications” (http://www.webappsec.org/projects/articles/062105.shtml, 1 June 2005)
12 Kevin Poulsen, “Known Hole Aided T-Mobile Breach” (Wired News, 28 February 2005)
13 Steve Kettman, “Soviets Burned by CIA Hackers?” (Wired News, 26 March 2004)
David Hoffman, “CIA Slipped Bugs to Soviets” (Washington Post, 27 February 2004)
14Such as the buffer overruns or race conditions discussed in Section 7 on Secure Software Construction.
15 Trying to break software security through testing, such as that suggested by [Whittaker and Thompson 2004], has also been rare except among major software vendors and users, though this has been improving.
16 An Easter egg is hidden functionality within an application program that is activated when an undocumented, often convoluted set of commands and keystrokes is entered. Easter eggs are typically used to display the credits of the application’s development team. Easter eggs are intended to be innocuous; however, because Easter eggs are virtually identical in implementation, if not in intent, to logic bombs, many organizations have adopted policies that forbid the use of software that contains Easter eggs. NIST SP 800-28: Guidelines on Active Content and Mobile Code. October 2001.
17 “Subversion” is used to describe subversion of people (e.g. developers), subversion of machines or network nodes, subversion of software, and of other things. [Anderson 2004]
1 “Trustworthiness” is thus dependent on the situation and the entity extending the trust (e.g. the entity’s degree of risk aversion) and is not an invariant characteristic of the software.
2For some sources on the WWW, see for example, Google’s “Definitions on the web” feature and Microsoft’s list of resources to Decode Technical Jargon available at http://www.microsoft.com/learning/start/terms.asp.
3Availability may include availability to share.
4For further information on this topic, see Security in the Software Lifecycle, Section 3.1.3, “Other Desirable Properties of Software and Their Impact on Security”.
5Another definition of security is, “All aspects related to defining, achieving, and maintaining confidentiality, integrity, availability, accountability, authenticity, and reliability.” [ISO/IEC13335-1].
6 According to the Merriam-Webster Dictionary, confidence is “the quality or state of being certain”. Trust is related to confidence, but trust is often bestowed without explicit justification for confidence. The entity trusted may or may not be trustworthy. In order to prepare for violations of trust, computer systems enforce accountability. Moreover, an entity’s history of behavior will directly affect that entity’s reputation. Particularly in e-commerce, mechanisms exist for publicizing each buyer’s and seller’s accountability, history, and reputation in order to provide evidence that others can then use in making their own determinations of whether confidence in the entity is justified, and thus whether they should trust the entity.
7 It has been suggested in review comments that, while it is possible to add grounds for confidence with activities such as independent evaluation, the bulk of the wherewithal for assurance might be expected to arise as a by-product of processes that produce a high-assurance product, for how else would the producers and their superiors rationally establish and maintain high confidence themselves? (Correspondingly, the absence of such by-products is grounds supporting a determination of lower confidence.)
8 While the term “operational product” is used here, concern for understanding, use, and evolution (as well as assurance) result in the need for a number of engineering artifacts such as specifications and designs that although some might not consider “operational” are nevertheless part of the “product” during its operational period.
9 Sometimes called an “assurance argument”; in this guide the term “assurance argument” or just “argument” is used for the arguments that connect the evidence to the assurance claims/conclusions.
10 Internet sites with material aiding in learning about assurance cases – albeit with a safety emphasis – include http://adelard.co.uk/iee_pn/ and http://www.esafetycase.com/
11 The objective of SafSec, developed by Praxis High Integrity Systems, is to provide a systems certification and accreditation (C&A) methodology that addresses both safety and security acceptance requirements. SafSec was originally developed for C&A of the United Kingdom Ministry of Defence (UK MOD) Integrated Modular Avionics and other advanced avionics architectures. SafSec is an example of how approaches from assurance of safety as a required property of software have been successfully applied to the assurance of security as a property of software.
12 To some in the safety community, a main distinction is not just probability/accidents vs. possibility/attacks but rather whether human life (or substantial assets outside the system under scrutiny) is threatened when a system malfunctions (whether due to accidents, attacks, or other causes). Since spies and soldiers have died because of security compromises, this view clearly emphasizes the asset of most immediate concern. On one hand (safety), the system is visualized as affecting the real world and causing hazards to be actualized; on the other hand (security), as protecting the integrity, confidentiality, or availability of computerized data or computing processes. However, since in both cases the key cost elements are the real-world consequences, this distinction can be overstated.
13 While the literature often does so, this guide seldom uses the term “threat” without a modifier because, when used alone, it may have several different meanings.
14 Federal Information Processing Standard (FIPS) 199, Standards for Security Categorization of Federal Information and Information Systems. Gaithersburg, MD: NIST, February 2004, defines a standard methodology for categorizing information (or data) assets according to their security attributes. FIPS 199 is available at http://csrc.nist.gov/publications/fips/index.html (February 2004).
15 Usable security is a significant issue and is addressed in both the Requirements and Design sections.
16 Certainly one would never want a critical structure such as a bridge to be built using a structural arrangement whose behavior could not be analyzed and predicted. Why should this be different for critical software?
17 In theory, game theory techniques that do not require the probabilities to be known could be applicable, but little progress has been made in practice.
18 ALARP is a significant concept in UK law, and an excellent engineering-oriented discussion of it appears in Annex B of DEF STAN 00-56 Part 2 [Ministry of Defence 2004b]
19 Defending everything may not be possible or may waste resources. “He who defends everything defends nothing.” – Frederick II
20 Noteworthy exceptions include the EC-Council Certified Secure Programmer (ECSP) and Certified Secure Application Developer (CSAD) certifications offered by the International Council of Electronic Commerce Consultants (E-Council).
21 The Information Systems Security Association lists a number of professional security certifications on their website, at http://www.issa.org/certifications.html.
22The first books enumerating steps to produce software appeared in the early 1960’s – if not earlier. Software process has been an active area of work in industry, research, and government ever since – within this has been significant work on processes for high-dependability systems. Today, a plethora of books contain general-purpose practices and processes. These range from lightweight processes placing few requirements on developers to heavyweight ones that provide a high level of guidance, discipline, and support. [Boehm 2003] Generally and not surprisingly, success in producing high-dependability systems aimed at safety or security has been greater with software processes closer to the heavyweight end of the spectrum and performed by highly skilled people.
23 Areas potentially covered by organizational security policies are listed in subsection .
24 Including protection from cyberstalking
25 David Elliott Bell and Leonard J. LaPadula, “Secure computer systems: mathematical foundations”. MITRE Corporation, 1973 - and - “Secure computer systems: unified exposition and MULTICS interpretation”. MITRE Corporation, 1976.
26 K. J. Biba. “Integrity Considerations for Secure Computer Systems” (in MITRE Technical Report TR-3153). The MITRE Corporation, April 1977.
27 Source: DOD 5200.28-STD, Department of Defense Trusted Computer Evaluation Criteria, December 1985.
28 Other forensic support includes support for identifying suspects and investigating insiders and outsiders. For insiders, whose identities may be known, automated recognition of unusual patterns of use could help support the identification of suspects.
29 If specified formally, this can allow static analysis of conformity of designs and code potentially adding creditable assurance evidence.
30 “Secret sharing“ is actually secret splitting in such a way that one must know t out of n parts to know the secret and, if one knows fewer parts, one knows nothing about the secret.
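The t-out-of-n property described in this note can be sketched with Shamir's polynomial scheme, in which the secret is the constant term of a random polynomial of degree t-1 and each share is a point on that polynomial. The following minimal Python sketch is illustrative only; the function names and the toy prime modulus are assumptions for exposition, and a real implementation would need a cryptographically secure random source and an appropriately sized field.

```python
# Illustrative t-of-n secret sharing (Shamir's scheme) over a prime field.
import random

P = 2087  # toy prime field modulus; assumes the secret is less than P

def split(secret, t, n):
    """Split `secret` into n shares such that any t shares reconstruct it."""
    # Random polynomial of degree t-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    # Each share is the point (x, f(x) mod P) for x = 1..n.
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Recover the secret by Lagrange interpolation at x = 0."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is the modular inverse of den (Fermat).
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total
```

Any t of the n shares determine the polynomial, and hence the secret; with fewer than t shares, every candidate secret remains equally consistent with the shares held, which is the "knows nothing" property the note describes.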
31 Although such software is presumably produced by specialists and ready for certification, US Federal government certifiers of cryptographic software report that a substantial fraction of submittals have serious problems and 90% have some problem. (CCUF II 2005)
1 Available at http://onlineethics.org/codes/softeng.html
2 A listing of the laws related to cyber security that were passed in 2005 in all the states is available at: http://www.cscic.state.ny.us/msisac/news/
3 See http://www.cdt.org/privacy/eudirective/EU_Directive_.html
4 See http://www.export.gov/safeharbor/
5 A number of relevant links are given at http://www.cdt.org/privacy/guide/basic/fips.html
6 Standard ... (7) An agreement among any number of organizations that defines certain characteristics, specification, or parameters related to a particular aspect of computer technology. [IEEE Std 100-1996, The IEEE Standard Dictionary of Electrical and Electronic Terms, Sixth Edition]
1Security in the Software Lifecycle, Section 4.1.1, “Software Risk Analysis and Threat Modeling Methodologies” describes a number of current approaches and supporting tools for performing software threat and risk analyses.
2 See, for example, http://lcic.org/ha.html
3 The JDCSISSS is a technical supplement to both the NSA/CSS Manual 130-1 and DIAM 50-4 and provides procedural guidance for the protection, use, management, and dissemination of SCI.
4 Chairman of the Joint Chiefs of Staff Manual (CJCSM) 6510.01, “Defense-in-Depth: Information Assurance (IA) and Computer Network Defense (CND)”, CH 1 (current as of 18 March 2005).
5 In the US DoD this could include mission impact assessment classified in accordance with DOD 5200.1-R and DOD Instruction 3600.2.
6 See http://chacs.nrl.navy.mil/publications/handbook/SPM.pdf (Accessed 20050917)
7 Also remember that – as in “unsecured” systems – while external behavior may be recorded separately from internal design, the problems are intertwined, and humans have legitimate trouble thinking of only one of them without the other.
1This section draws heavily on an existing larger compilation from a number of sources in [Redwine 2005b]. Portions of these principles were first collected in [SDI 1992] or [Neumann 2003] as well as [NIST Special Pub 800-27] and are described in more detail there. Discussions of some appear in [Howard and LeBlanc 2003], [Viega and McGraw 2002], and [Bishop 2003].
2 In current practice, a single super user whose behavior is audited is more often used. Of course, this implies the super user cannot tamper with the auditing or the audit logs/records.
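The tamper-resistance requirement in this note can be partly met by making the audit log tamper-evident, for example with a hash chain in which each entry's digest covers the previous digest. The sketch below is a hypothetical illustration, not a design prescribed by this guide; in practice the chain head or periodic digests must also be anchored somewhere the super user cannot rewrite, such as write-once media or a separate logging host.

```python
# Tamper-evident audit log via hash chaining: each entry's digest covers
# the previous digest, so altering or deleting an earlier record breaks
# the chain for all later entries.
import hashlib

GENESIS = "0" * 64  # digest used before any entries exist

def append_entry(log, message):
    """Append (message, digest), chaining the digest to the previous one."""
    prev = log[-1][1] if log else GENESIS
    digest = hashlib.sha256((prev + message).encode()).hexdigest()
    log.append((message, digest))

def verify(log):
    """Recompute the chain; returns False if any record was altered."""
    prev = GENESIS
    for message, digest in log:
        if hashlib.sha256((prev + message).encode()).hexdigest() != digest:
            return False
        prev = digest
    return True
```

Changing any stored message, or removing one mid-chain, invalidates every subsequent digest, so tampering is detectable by anyone holding an authentic copy of the most recent digest.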
3Quite apart from security, good practice would call for incorporating prototypes into a production version only if the prototype was designed with production standards in mind. Otherwise one might be going “live” with something never designed for a production environment.
4 The International Association of Software Architects site is at http://www.iasahome.org/iasaweb/appmanager/home/home and contains relevant material.
5 Center for Internet Security website is www.cisecurity.org
1 Of potential future interest, but not covered, is MC/DC testing per the FAA-recognized RTCA/DO-178B avionics standard.
2Should we include future and variant versions as well as software using work products, methods, techniques, or ideas from this effort or product? What counts as harm? How insignificant a harm can be safely ignored? What about offsetting benefits, say in fewer accidents or reduced security spending for physical security?
3 One confounding factor is that a number of such projects have been government procurements in which contractors or groups of contractors have “pushed back” and possibly less-than-highly-knowledgeable government negotiators have retreated. In other software areas, governments have been said to retreat in the face of such arguments as, “We cannot hire enough programmers who already know it and, if we have to train them and bring them up to speed, your whole system’s tight schedule will slip.” Of course, history seems to indicate that major systems have significant slippage deriving from multiple causes.
5 Personal communication with author (SR) September 26, 2005
6 ISMS Users Group, http://www.xisec.com/ These certifications were clearly to the old version of ISO 17799.
7 Thanks to Mark Blackburn for pointing this out.
8 ALARP is a significant concept in UK law, and an excellent engineering-oriented discussion of it appears in Annex B of DEF STAN 00-56 Part 2.
1 Readers should be aware of the NIST SAMATE Project and its Nov. 2006 Workshop on Software Security Assurance Tools, Techniques, and Metrics
2 See http://directory.fsf.org/libsafe.html
3 See http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dncode/html/secure01142004.asp
4 See http://faculty.erau.edu/korn/publ05.html
5 “DO” is a DoD acronym the expansion of which has been lost.
6 See http://samate.nist.gov
1 An mp3 recording of a discussion with Clifford Berg on security and agile processes is available at http://agiletoolkit.libsyn.com/index.php?post_id=53799 (200603), in which he describes a number of key issues for an experienced agile devotee.
2Microsoft Academic Days on Trustworthy Computing, 7 April 2006
3 Personal communication with author (SR) September 26, 2005.
4 This document does not address the unique security issues involved in processes for developing cryptographic software.
5 See http://www.inrialpes.fr/vasy/fmics/
1 For more material see the management section of the Build-Security-In website http://buildsecurityin.us-cert.gov
1 The most widely recognized Body of Knowledge on business continuity planning is Professional Practices for the Business Continuity Planner, jointly published by DRI (Disaster Recovery Institute) International in the US and the Business Continuity Institute (BCI) in the UK. This document is available at http://www.drii.org/displaycommon.cfm?an=2.
1 See http://17799-news.the-hamster.com/issue09-news1.htm (accessed 2005/09/17)
2 This language is provided to provoke thought and has not been reviewed by attorneys for its efficacy.
3 This language is provided to provoke thought and has not been reviewed by attorneys for its efficacy.
1Items in this section often reflect generally held opinions or individual experiences as selected by the author and editor. As with all contents of this document, users must exercise their own judgment concerning them. See disclaimer on back of title page.
2 Textbooks exist such as [Pfleeger 2003].
3 [Kelly 1994] further addresses some of the organizational issues, e.g. on page 55.
4 A tutorial on using SafSec for security is planned for IEEE International Symposium on Secure Software Engineering, March 13, 2006, Washington D.C. area.
5 A construction-relevant body of knowledge is [Pomeroy-Huff 2005] for the Personal Software Process.
6 For CISSE see http://www.ncisse.org/conferences.htm