
System Attributes

information that we have; that “warm fuzzy” feeling that we can use the information with some level of assurance that all, or at least most, of the attributes discussed here have been adequately satisfied; i.e., the information is timely, accurate, relevant, authentic, etc.


Security

The security attributes include cyber security, physical security, information security (InfoSec), authorization, dispersion, deception, and others. Security can be defined as [FM 100–6]:

• Protection from information compromise, including unauthorized access to information or services

• Prevention of denial of service, including loss or disruption of information exchanges or services

The level of security required depends on the nature of the information to be protected and the threat (or perceived threat) of interception or exploitation.

Security of information systems is generally achieved by a combination of physical access control, user authentication, and protection of electronic, online communication. Security maintains the integrity of the organization and the information system, but must be balanced by the need to disseminate critical information quickly. If absolute security is our primary concern, we can simply sever all connections to anything outside the system itself, isolate it in a safe or secure room, lock the door, and post a guard. A system thus protected is not likely to be compromised. However, it is also very nearly useless. A delicate balance exists between the level of security that is necessary or desired and the need for usability. There is no such thing as a totally (100 percent) secure system; vulnerabilities always exist. The questions then are:

• What level of risk is acceptable?

• What level of security is achievable?

• What is the value of the information or the cost of compromise?

• What is the cost of the required protection measures?
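As a rough illustration of how these questions interact, the sketch below compares the yearly loss a compromise is expected to cause with the yearly cost of protecting against it. All figures, names, and the simple expected-loss model are assumptions made for illustration, not values or methods taken from this text.

# Illustrative only: a back-of-the-envelope comparison of the cost of a
# protection measure against the expected cost of compromise. All figures
# below are hypothetical assumptions.

def expected_annual_loss(asset_value: float, annual_likelihood: float) -> float:
    """Expected yearly loss if the information were compromised."""
    return asset_value * annual_likelihood

def protection_is_justified(asset_value: float,
                            likelihood_without: float,
                            likelihood_with: float,
                            annual_control_cost: float) -> bool:
    """A control is (roughly) justified when the loss it avoids exceeds its cost."""
    avoided_loss = (expected_annual_loss(asset_value, likelihood_without)
                    - expected_annual_loss(asset_value, likelihood_with))
    return avoided_loss > annual_control_cost

# Hypothetical example: information worth $500,000, a 10% annual chance of
# compromise without the control, 2% with it, and a control costing $20,000/year.
print(protection_is_justified(500_000, 0.10, 0.02, 20_000))  # True: 40,000 > 20,000

Such a calculation is only a starting point; it still leaves open how much residual risk the organization is willing to accept.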

Cyber Security

The term cyber security has been used to include both the physical security of the system and the electronic security of the information residing in the system. Traditionally, security best practice has advocated a layered security approach, and the layers almost always start with limiting physical access [TCSEC, 1985].

Physical Security

The best way to protect an information system and the information on that system is to prevent unauthorized access to that system. The physical security of the system typically includes a number of layers of access control, from perimeter security (fences, guards, surveillance), facility security (building access control, monitoring), and hardware security (physical access to workstations themselves) to user authentication and authorization.

System assurance: Assurance is the system characteristic enabling confidence that the system fulfills its intended purpose. It is fundamental to security that the system implementation is of sufficient quality to provide confidence in the correct operation of security mechanisms and in the system’s resistance to deliberate or unintentional penetration. Technology has been developed to produce and measure the assurance of information systems. System assurance can be increased by using simple solutions, using higher assurance components, architecting to limit the impact of penetrations, and including trustworthy detection and recovery capabilities. System assurance both supports the architecture and spans it.

Operating system (OS) security services: System security ultimately depends on the underlying operating system mechanisms. If these underlying supports are weak, then security can be bypassed or subverted. System security can be no stronger than the underlying operating system.

Distributed system security services: While some services reside in a particular logical level of the system hierarchy, many are implemented via mechanisms that span the system both physically and logically.

Security Domains

A foundation for information systems security (ISS) is the concept of security domains and enforcement of data and process flow restrictions within and between these domains [ISSEP, 2000].

• A domain is a set of active entities (person, process, or device), their data objects, and a common security policy.

• Domains can be logical as well as physical; dividing an organization’s computing enterprise into domains is analogous to building fences (various types of security barriers), placing gates within the fences (e.g., firewalls, gateways, and internal process separation), and assigning guards to control traffic through the gates (technical and procedural security services).

• Domains are defined using factors that include one or more of the following:
  - Physical (e.g., building, campus, region, etc.)
  - Business process (e.g., personnel, finance, etc.)
  - Security mechanisms (e.g., NT domain, network information system (NIS), UNIX groups, etc.)

The key elements to be addressed in defining domains are flexibility, tailored protection, domain interrelationships, and the consideration of multiple perspectives of what is important in information system security.
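The following sketch, offered only as an illustration of the fence/gate/guard analogy, models domains with a simple sensitivity level and a gate whose guard decides whether data may flow between them. The class names, the numeric classification scale, and the flow rule are all assumptions, not a prescribed design.

# A minimal sketch of the domain/gate/guard analogy. Names, the numeric
# classification scale, and the flow rule are hypothetical illustrations.

from dataclasses import dataclass, field

@dataclass
class Domain:
    name: str                    # e.g., a physical site or a business process
    classification: int          # higher number = more sensitive (assumed scale)
    members: set = field(default_factory=set)   # active entities in the domain

@dataclass
class Gate:
    """A controlled opening in the 'fence' between two domains (e.g., a firewall)."""
    source: Domain
    destination: Domain

    def guard_allows(self, data_classification: int) -> bool:
        # Example rule (an assumption): data may flow only into a domain whose
        # classification is at least as high as the data itself.
        return self.destination.classification >= data_classification

finance = Domain("finance", classification=3)
campus_lan = Domain("campus LAN", classification=1)

print(Gate(finance, campus_lan).guard_allows(data_classification=3))  # False: sensitive data stays inside
print(Gate(campus_lan, finance).guard_allows(data_classification=1))  # True: low-sensitivity data may flow in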


Authorization

Authorization is a system attribute, as defined here, and an information attribute, as defined earlier. Individuals should only be able to execute transactions or perform operations for which they have been granted permission (such as approving a purchase or withdrawing cash at an ATM); in other words, only if they have the appropriate signing authority [Entrust, 2000]. The intent here is to prevent deception, the intention or tendency to mislead or deceive.
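A minimal sketch of such a check is shown below: a transaction proceeds only if the individual has been granted the corresponding signing authority. The user names, operations, and limits are hypothetical.

# A minimal authorization sketch: a transaction executes only if the user
# holds the matching permission. Names and limits are hypothetical.

AUTHORIZATIONS = {
    "alice": {"approve_purchase": 10_000},   # signing authority up to $10,000
    "bob":   {"withdraw_cash": 500},         # ATM withdrawal limit of $500
}

def authorized(user: str, operation: str, amount: float) -> bool:
    """Return True only if the user has been granted this operation up to this amount."""
    limit = AUTHORIZATIONS.get(user, {}).get(operation)
    return limit is not None and amount <= limit

print(authorized("alice", "approve_purchase", 7_500))  # True
print(authorized("bob", "approve_purchase", 100))      # False: no signing authority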

Versatility

Versatility is the ability to adapt readily to unforeseen requirements. It is some combination of system autonomy, flexibility, interoperability, and robustness. These attributes are often in tension with one another.

Autonomy represents the ability of systems to function independently. Most information systems, especially large-scale systems, require some interconnectivity to perform the operations for which they were intended. However, when planned or unplanned service interruptions occur, systems should be able to carry on, at least in some limited capacity.
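One common way to provide this limited capacity, sketched below purely as an illustration, is to fall back on locally cached data when the interconnection is unavailable; the fetch_remote function and the cache contents are hypothetical stand-ins.

# Degraded-mode sketch: when the interconnection fails, the system falls back
# to locally cached data rather than stopping altogether. Hypothetical example.

import time

LOCAL_CACHE = {"exchange_rate": (1.08, "2024-01-01T00:00:00Z")}  # value, timestamp

def fetch_remote(key: str) -> float:
    raise ConnectionError("upstream service unreachable")  # simulated outage

def get_value(key: str) -> tuple:
    """Return (value, is_current). Falls back to the cache during an outage."""
    try:
        value = fetch_remote(key)
        LOCAL_CACHE[key] = (value, time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()))
        return value, True
    except ConnectionError:
        value, _timestamp = LOCAL_CACHE[key]
        return value, False          # operate in a limited, possibly stale, capacity

print(get_value("exchange_rate"))    # (1.08, False) while the interruption lasts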

Flexibility is responsiveness to change, specifically as it relates to user information needs and the operational environment. Planners must be flexible in supporting information systems requirements in changing situations. They should anticipate the possibility of changes in the organizational mission or situation and build a plan to accommodate these changes.

Interoperability can be defined as the ability of systems, units, or organizations to provide services to and accept services from other systems, units, or organizations, and to use the exchanged services to operate effectively together. Interoperability is the capability of information systems to work together as a system of systems. Interoperability implies compatibility of combined and organizationally common information or data elements and procedures, and is the foundation on which information systems capabilities depend. An interoperable information system is visible at all functional levels: a secure, seamless, cohesive infrastructure that satisfies system and information connectivity requirements from the highest levels of management to the lowest information request. Information systems should comply with the organization’s formal information system technical architecture. Adherence to the standards and protocols defined therein helps ensure interoperability and seamless exchange of information between organizational components. Older, legacy information systems that do not comply with the system architecture and accepted standards will require special planning and may not be interoperable [FM 100–6].

Robustness is the ability of the system to continue to function in the face of unexpected disturbances.

Continuity

Continuity is the uninterrupted availability of information paths for the effective performance of organizational functions. Applying the subordinate elements of survivability, reliability, redundancy, and connectivity results in continuity. Global reach is achieved electronically, quickly, and often with a seamless architecture to support the requirements of the managers and their staffs. Connectivity is absolutely essential to the deployment and agility required of real-time systems [FM 100–6]. Continuity is a measure of the system’s availability or readiness.

Connectivity

Connectivity is very similar to interoperability. In order to function in a widespread, diverse computing environment, the system must be connected to and communicate with that environment in some fashion, and remain so for the required duration of the operations requiring interaction with other systems.

Redundancy

Duplication of functional capability is generally built into information systems. The amount and complexity of that redundancy usually depends on the criticality of the system, the criticality of the information resident in it, or both. Redundancy can be complex and expensive, but is essential, at some level, for critical systems.

From an information systems network perspective, planners provide diverse paths over multiple means to ensure timely, reliable information flow. From an equipment perspective, planners ensure that sufficient backup systems and repair parts are available to maintain system or network capabilities [JP 6–0, 6–02; FM 100–6].
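The value of diverse paths can be made concrete with a little arithmetic: if the paths fail independently, information flow is lost only when every path is down at the same time. The per-path availability figures below are hypothetical.

# Illustrative arithmetic for diverse paths: with independent parallel paths,
# overall availability is 1 minus the probability that all paths fail at once.

def combined_availability(path_availabilities: list) -> float:
    """Availability of at least one path, assuming independent failures."""
    prob_all_fail = 1.0
    for a in path_availabilities:
        prob_all_fail *= (1.0 - a)
    return 1.0 - prob_all_fail

print(combined_availability([0.95]))              # 0.95    -- a single path
print(combined_availability([0.95, 0.95]))        # 0.9975  -- one redundant path
print(combined_availability([0.95, 0.95, 0.90]))  # 0.99975 -- two redundant paths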

Reliability

Reliability is a measure of system dependability. From a systems engineering perspective, it can be defined as:

• The probability that a system successfully operates to time t [Eisner, 1997].

• The likelihood of mission success, given that a system was available to operate at the beginning of the mission [Eisner, 1987].


Some systems operate continuously, others remain idle for long periods of time until they are needed. In either case, it is important that a system is up and operating when called on. Reliability is closely related to availability.
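Two standard systems-engineering expressions make these definitions concrete: under a constant failure rate, reliability over a period t is R(t) = exp(-t / MTBF), and steady-state availability is A = MTBF / (MTBF + MTTR). The sketch below, with hypothetical numbers, shows both.

# Standard reliability and availability expressions (hypothetical numbers).

import math

def reliability(t_hours: float, mtbf_hours: float) -> float:
    """Probability the system operates without failure through time t (constant failure rate)."""
    return math.exp(-t_hours / mtbf_hours)

def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Long-run fraction of time the system is up and ready when called on."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

print(round(reliability(t_hours=24, mtbf_hours=1000), 4))     # ~0.9763
print(round(availability(mtbf_hours=1000, mttr_hours=4), 4))  # ~0.996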

Survivability

Survivability is a measure of the system’s ability to function under less-than-optimal, degrading circumstances. Information systems must be reliable, robust, resilient, and at least as survivable as the supported organization. Distributed systems and alternate means of communication provide a measure of resilience. Systems must be organized and positioned to ensure that performance under adverse conditions degrades gradually and not catastrophically [FM 100–6].

Simplicity

Simplicity is a measure of the complexity of the environment. It includes standardization and technical sophistication.

Standardization

Standardization can be both a blessing and a curse. Systems, in general, are not built as a single, unified whole, nor are they constructed, integrated, and operated by any one entity; the very few, large, complex, global systems that are, must still interface with other systems or components. In order to facilitate this interaction and interconnectivity, some minimal set of hardware and software interface standards must be adopted. As we shall see later, one of the biggest issues with interoperability of systems today is the lack of standards or the lack of adherence to existing standards.

On the other hand, standards can stifle creativity and technological advancement. If we are bound to an existing and (by definition) aging standard, we are often precluded from inventing and adopting a better mousetrap. There must be an informed, technically sound trade-off between these two extremes.

Examples of successful application of standards to large-scale, diverse, ubiquitous systems abound. Electrical power distribution grids and telephone systems come to mind. In addition, the computing world has successively adopted standards for computer boards, serial and parallel interfaces, modems, local networks, and the World Wide Web. Unfortunately, not all global information systems have been similarly successful.

Technical Sophistication

Like standards, technology can be both a help and a hindrance. Not enough technology can keep systems from performing at desired or required levels. Too much technology can overly complicate the system, requiring an inordinate amount of time, money, manpower, and other valuable resources to keep it running smoothly. In addition, advanced technology is not always a good thing. Examples abound of technologically advanced or superior products that failed to attain widespread acceptance. Sony’s Beta video technology and IBM’s PC Micro Channel Architecture (MCA) are good examples. While both were generally recognized as technically superior standards, they failed to proliferate and gain market share, and consequently fell into virtual disuse (as with MCA) or niche use (as with Beta). The result is difficulty supporting, expanding, and implementing such systems.

An alternative view of how these attributes might interrelate is shown in Exhibit 5.

Information System Support Planning Principles

When planning or designing any system, the first and probably the most critical step in the entire process is the collection, definition, and analysis of requirements. Each of the attributes described within these pages embodies one or more requirements that must be taken into account. Thus, system planning principles include most of the system and information attributes described previously in addition to many others too numerous to mention here. Despite a certain level of redundancy with the foregoing discussion, we have chosen to present nine principles that seem to be central to a solid foundation. The information systems planning principles discussed here are derived from Department of Defense Joint Publications [JP 6–0, 6–02], but represent a solid foundation for all information system planning and design.

These principles focus the planner’s attention on what is important to the user. Accountability, flexibility, interoperability, redundancy, security, standardization, and survivability have been addressed already and will not be repeated here. However, in addition to these we have the following:

Exhibit 5 Information System Attributes (figure). The diagram groups the system attributes under the information systems support principles: Continuity (redundancy, connectivity), Versatility (flexibility, interoperability, autonomy), Security (InfoSec, physical security, dispersion, deception), and Simplicity (technological sophistication, standardization).


Economy (of Scale)

Scalable system packages ease the application of economy. Space, weight, or time constraints limit the quantity or capability of systems that can be deployed. Information requirements must be satisfied by consolidating similar functional facilities, integrating commercial systems into organizational information networks, or resorting to a different information system.

Modularity

Modularity is distinguished by small packages consisting of sets of equipment, people, and software adaptable for a wide range of missions. Planners must understand the mission, the leader’s intent and operational plan, the availability of assets, and the information structure required to meet the needs of each operation. These packages must satisfy the organization’s informational requirements during the execution phases of the mission. Modular information systems packages must be flexible, easily scaled, and tailored with respect to capability.

Affordability

Affordability is the extent to which information system features are cost effective on both a recurring and nonrecurring basis. It is perhaps the major factor in system procurements. From the federal government to industry, cost has always been a driving concern. The important point here is to match the cost with the benefits of the system. Cost-benefit analyses are routinely conducted for all manner of purchases, acquisitions, and updates. The one area where this trade-off is lacking, however, is systems and information security features. This will be discussed in greater detail when we talk about risk, threat, and vulnerability in Chapter 3.

All of the attributes discussed thus far have appeared in a variety of other IA-related works. However, there is one system attribute that the authors have not seen discussed elsewhere.

Maintainability

All systems require maintenance of one sort or another, and the concept of maintainability may be implied in some of the other “ilities” we have presented here. However, for clarity, let us break this attribute out so that it can be addressed specifically. Maintainability, in systems engineering parlance, is defined as “the general ease of a system to be maintained, at all levels of maintenance” [Eisner, 1987].

Most new, major hardware system acquisitions today (such as aircraft, ships, automobiles, etc.) are specifically designed with maintainability in mind.

Information systems are no different. There are a number of steps that can be taken, from both the hardware and software perspectives, that will decrease the maintenance costs, while increasing reliability and availability. Though a detailed discussion of these issues is beyond our scope here, maintainability