Which of the following factors should an IS auditor primarily focus on when determining the appropriate level of protection for an information asset?

Risk Management

David Watson, Andrew Jones, in Digital Forensics Processing and Procedures, 2013

5.5.1 Overview

Information security risk management is the systematic application of management policies, procedures, and practices to the task of establishing the context, identifying, analyzing, evaluating, treating, monitoring, and communicating information security risks.

Information Security Management can only be implemented successfully with an effective information security risk management process. There are a number of national and international standards that specify risk approaches, and the Forensic Laboratory is able to choose which it wishes to adopt, though ISO 27001 is the preferred standard and the Forensic Laboratory will want to be certified to this standard. A list of some of these standards is given in Section 5.1.

An ISMS is a documented system that describes the information assets to be protected, the Forensic Laboratory’s approach to risk management, the control objectives and controls, and the degree of assurance required. The ISMS can be applied to a specific system, components of a system, or the Forensic Laboratory as a whole.


URL: https://www.sciencedirect.com/science/article/pii/B9781597497428000054

Thinking About Risk

Stephen D. Gantz, Daniel R. Philpott, in FISMA and the Risk Management Framework, 2013

Information Security Risk

Information security risk comprises the impacts to an organization and its stakeholders that could occur due to the threats and vulnerabilities associated with the operation and use of information systems and the environments in which those systems operate. The primary means of mitigating information security-related risk is through the selection, implementation, maintenance, and continuous monitoring of preventive, detective, and corrective security controls to protect information assets from compromise or to limit the damage to the organization should a compromise occur. Information security risk overlaps with many other types of risk in terms of the kinds of impact that might result from the occurrence of a security-related incident. It is also influenced by factors attributed to other categories of risk, including strategic, budgetary, program management, investment, political, legal, reputation, supply chain, and compliance risk.


URL: https://www.sciencedirect.com/science/article/pii/B9781597496414000035

Introduction

In Information Security Risk Assessment Toolkit, 2013

Information Security Risk Assessment Toolkit details a methodology that adopts the best parts of some established frameworks and teaches you how to use the information that is available (or not) to pull together an IT Security Risk Assessment that will allow you to identify high-risk areas. Whether your objective is to forecast budget items, identify areas of operational or program improvement, or meet regulatory requirements, we believe this publication will provide you with the tools to execute an effective assessment and, more importantly, adapt a process that will work for you. Many of the tools that we’ve developed to make this process easier for us are available as a companion for this publication at http://booksite.syngress.com/9781597497350. We hope that you find our methodology, and accompanying tools, as useful in executing your IT Security Risk Assessments as we have.


URL: https://www.sciencedirect.com/science/article/pii/B9781597497350000178

Risk Management

Sokratis K. Katsikas, in Computer and Information Security Handbook (Second Edition), 2013

2 Expressing and Measuring Risk

Information security risk “is measured in terms of a combination of the likelihood of an event and its consequence.”8 Because we are interested in events related to information security, we define an information security event as “an identified occurrence of a system, service or network state indicating a possible breach of information security policy or failure of safeguards, or a previously unknown situation that may be security relevant.”9 Additionally, an information security incident is “indicated by a single or a series of unwanted information security events that have a significant probability of compromising business operations and threatening information security.”10 These definitions actually invert the investment assessment model, where an investment is considered worth making when its cost is less than the product of the expected profit times the likelihood of the profit occurring. In our case, the risk R is defined as the product of the likelihood L of a security incident occurring times the impact I that the organization will incur due to the incident, that is, R = L × I.11
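
To make the R = L × I relation concrete, the short sketch below computes risk for a few hypothetical incidents, with likelihood expressed as an annual probability and impact in monetary units. The incident names and figures are illustrative assumptions, not values from the chapter.

```python
# Minimal sketch (hypothetical figures): risk as the product of the
# likelihood of a security incident and the impact it would cause, R = L x I.

incidents = {
    # name: (annual likelihood, impact in monetary units)
    "laptop theft": (0.30, 5_000),
    "ransomware outbreak": (0.05, 250_000),
    "accidental data deletion": (0.60, 2_000),
}

for name, (likelihood, impact) in incidents.items():
    risk = likelihood * impact  # R = L x I
    print(f"{name}: R = {likelihood} x {impact} = {risk:,.0f}")
```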

To measure risk, we adopt the fundamental principles and the scientific background of statistics and probability theory, particularly of the area known as Bayesian statistics, after the mathematician Thomas Bayes (1702–1761), who formalized the namesake theorem. Bayesian statistics is based on the view that the likelihood of an event happening in the future is measurable. This likelihood can be calculated if the factors affecting it are analyzed. For example, we are able to compute the probability of our data being stolen as a function of the probability that an intruder will attempt to intrude into our system and of the probability that he will succeed. In risk analysis terms, the former probability corresponds to the likelihood of the threat occurring and the latter corresponds to the likelihood of the vulnerability being successfully exploited. Thus, risk analysis assesses the likelihood that a security incident will happen by analyzing and assessing the factors that are related to its occurrence, namely the threats and the vulnerabilities. Subsequently, it combines this likelihood with the impact resulting from the incident occurring to calculate the system risk. Risk analysis is a necessary prerequisite for subsequently treating risk. Risk treatment pertains to controlling the risk so that it remains within acceptable levels. Risk can be reduced by applying security measures; it can be shared, by outsourcing or by insuring; it can be avoided; or it can be accepted, in the sense that the organization accepts the likely impact of a security incident.

The likelihood of a security incident occurring is a function of the likelihood that a threat appears and of the likelihood that the threat can successfully exploit the relevant system vulnerabilities. The consequences of the occurrence of a security incident are a function of the likely impact that the incident will have on the organization as a result of the harm that the organization assets will sustain. Harm, in turn, is a function of the value of the assets to the organization. Thus, the risk R is a function of four elements: (a) A, the value of the assets; (b) T, the severity and likelihood of appearance of the threats; (c) V, the nature and the extent of the vulnerabilities and the likelihood that a threat can successfully exploit them; and (d) I, the likely impact of the harm should the threat succeed, that is, R = f(A, T, V, I).

If the impact is expressed in monetary terms then, since the likelihood is dimensionless, risk can also be expressed in monetary terms. This approach has the advantage of making the risk directly comparable to the cost of acquiring and installing security measures. Since security is often one of several competing alternatives for capital investment, the existence of a cost/benefit analysis that would offer proof that security will produce benefits that equal or exceed its cost is of great interest to the management of the organization. Of even more interest to management is the analysis of the investment opportunity costs, that is, its comparison to other capital investment options.12 However, expressing risk in monetary terms is not always possible or desirable, since harm to some kinds of assets (human life) cannot (and should not) be assessed in monetary terms. This is why risk is usually expressed in nonmonetary terms, on a simple dimensionless scale.
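
As a sketch of the cost/benefit reasoning described above, the snippet below compares the monetary risk reduction a security measure is expected to deliver with the cost of that measure. All figures are hypothetical.

```python
# Sketch (hypothetical figures): does the expected risk reduction from a
# security measure justify its cost?

likelihood_before = 0.20   # annual likelihood of the incident without the measure
likelihood_after = 0.05    # estimated annual likelihood with the measure in place
impact = 100_000           # monetary impact if the incident occurs
measure_cost = 10_000      # annual cost of acquiring and operating the measure

risk_before = likelihood_before * impact
risk_after = likelihood_after * impact
benefit = risk_before - risk_after   # expected reduction in monetary risk

print(f"Risk reduction: {benefit:,.0f}, measure cost: {measure_cost:,}")
print("Measure is justified" if benefit >= measure_cost else "Measure is not justified")
```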

Assets in an organization are usually quite diverse. Because of this diversity, it is likely that some assets that have a known monetary value (hardware) can be valued in the local currency, whereas others of a more qualitative nature (data or information) may be assigned a numerical value based on the organization’s perception of their value. This value is assessed in terms of the assets’ importance to the organization or their potential value in different business opportunities. The legal and business requirements are also taken into account, as are the impacts to the asset itself and to the related business interests resulting from a loss of one or more of the information security attributes (confidentiality, integrity, availability). One way to express asset values is to use the business impacts that unwanted incidents, such as disclosure, modification, nonavailability, and/or destruction, would have to the asset and the related business interests that would be directly or indirectly damaged. An information security incident can impact more than one asset or only a part of an asset. Impact is related to the degree of success of the incident. Impact is considered as having either an immediate (operational) effect or a future (business) effect that includes financial and market consequences. Immediate (operational) impact is either direct or indirect.

Direct impact may result from the financial replacement value of a lost asset (or part of one), the cost of acquisition, configuration, and installation of the new asset or backup, or the cost of operations suspended due to the incident until the service provided by the asset(s) is restored. Indirect impact may result because financial resources needed to replace or repair an asset would have been used elsewhere (opportunity cost), from the cost of interrupted operations, from potential misuse of information obtained through a security breach, or from violation of statutory or regulatory obligations or of ethical codes of conduct.13

These considerations should be reflected in the asset values. This is why asset valuation (particularly of intangible assets) is usually done through impact assessment. Thus, impact valuation is not performed separately but is rather embedded within the asset valuation process.

The responsibility for identifying a suitable asset valuation scale lies with the organization. Usually, a three-value scale (low, medium, and high) or a five-value scale (negligible, low, medium, high, and very high) is used.14

Threats can be classified as deliberate or accidental. The likelihood of deliberate threats depends on the motivation, knowledge, capacity, and resources available to possible attackers and the attractiveness of assets to sophisticated attacks. On the other hand, the likelihood of accidental threats can be estimated using statistics and experience. The likelihood of these threats might also be related to the organization’s proximity to sources of danger, such as major roads or rail routes, and factories dealing with dangerous material such as chemical materials or oil. The organization’s geographical location will also affect the possibility of extreme weather conditions. The likelihood of human errors (one of the most common accidental threats) and equipment malfunction should also be estimated.15 As already noted, the responsibility for identifying a suitable threat valuation scale lies with the organization. What is important here is that the interpretation of the levels is consistent throughout the organization and clearly conveys the differences between the levels to those responsible for providing input to the threat valuation process. For example, if a three-value scale is used, the value low can be interpreted to mean that it is not likely that the threat will occur; there are no incidents, statistics, or motives that indicate that this is likely to happen. The value medium can be interpreted to mean that it is possible that the threat will occur; there have been incidents in the past or statistics or other information that indicate that this or similar threats have occurred sometime before, or there is an indication that there might be some reasons for an attacker to carry out such action. Finally, the value high can be interpreted to mean that the threat is expected to occur; there are incidents, statistics, or other information that indicate that the threat is likely to occur, or there might be strong reasons or motives for an attacker to carry out such action.16

Vulnerabilities can be related to the physical environment of the system, to the personnel, management, and administration procedures and security measures within the organization, to the business operations and service delivery or to the hardware, software, or communications equipment and facilities. Vulnerabilities are reduced by installed security measures. The nature and extent as well as the likelihood of a threat successfully exploiting the three former classes of vulnerabilities can be estimated based on information on past incidents, on new developments and trends, and on experience. The nature and extent as well as the likelihood of a threat successfully exploiting the latter class, often termed technical vulnerabilities, can be estimated using automated vulnerability-scanning tools, security testing and evaluation, penetration testing, or code review.17 As in the case of threats, the responsibility for identifying a suitable vulnerability valuation scale lies with the organization. If a three-value scale is used, the value low can be interpreted to mean that the vulnerability is hard to exploit and the protection in place is good. The value medium can be interpreted to mean that the vulnerability might be exploited, but some protection is in place. The value high can be interpreted to mean that it is easy to exploit the vulnerability and there is little or no protection in place.18
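
One common way to turn the qualitative scales above into a comparable, dimensionless risk figure is a simple additive matrix over asset value, threat level, and vulnerability level, in the style of the example matrices in ISO/IEC 27005 annexes. The sketch below is illustrative; the numeric indices and equal weighting are assumptions, not a prescribed method.

```python
# Sketch of an additive qualitative risk matrix over the three-value scales
# discussed above. The numeric indices are illustrative assumptions.

LEVELS = {"low": 0, "medium": 1, "high": 2}

def risk_score(asset_value: str, threat_level: str, vulnerability_level: str) -> int:
    """Return a dimensionless score from 0 (lowest risk) to 6 (highest risk)."""
    return LEVELS[asset_value] + LEVELS[threat_level] + LEVELS[vulnerability_level]

# A high-value asset facing a medium-likelihood threat and a high vulnerability:
print(risk_score("high", "medium", "high"))  # -> 5
```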


URL: https://www.sciencedirect.com/science/article/pii/B9780123943972000532

Risk Management

Stephen D. Gantz, Daniel R. Philpott, in FISMA and the Risk Management Framework, 2013

Risk Management

The Federal Information Security Management Act defines information security as “the protection of information and information systems from unauthorized access, use, disclosure, disruption, modification, or destruction” in order to safeguard their confidentiality, integrity, and availability [1]. No organization can provide perfect information security that fully assures the protection of information and information systems, so there is always some chance of loss or harm due to the occurrence of adverse events. This chance is risk, typically characterized as a function of the severity or extent of the impact to an organization due to an adverse event and the likelihood of that event occurring [2]. Organizations identify, assess, and respond to risk using the discipline of risk management. Information security represents one way to reduce risk, and in the broader context of risk management, information security management is concerned with reducing information system-related risk to a level acceptable to the organization. Legislation addressing federal information resources management consistently directs government agencies to follow risk-based decision-making practices when investing in, operating, and securing their information systems, obligating agencies to establish risk management as part of their IT governance [3]. Effective information resources management requires understanding and awareness of types of risk from a variety of sources. Although initial NIST guidance on risk management published prior to FISMA’s enactment emphasized addressing risk at the individual information system level [4], the NIST Risk Management Framework and guidance on managing risk in Special Publication 800-39 now position information security risk as an integral component of enterprise risk management practiced at organization, mission and business, and information system tiers, as illustrated in Figure 13.1.


Figure 13.1. Information Security Risk Management Must Occur At and Between All Levels of the Organization to Enable Pervasive Risk Awareness and to Help Ensure Consistent Risk-Based Decision Making Throughout the Organization [6]

Despite the acknowledged importance of enterprise risk management, NIST explicitly limits the intended use of Special Publication 800-39 to “the management of information security-related risk derived from or associated with the operation and use of information systems or the environments in which those systems operate” [5]. System owners and agency risk managers should not use this narrow scope to treat information security risk in isolation from other types of risk. Depending on the circumstances faced by an organization, the sources of information security risk may impact other enterprise risk areas, potentially including mission, financial, performance, legal, political, and reputation forms of risk. For instance, a government agency victimized by a cyber attack may suffer monetary losses from allocating resources necessary to respond to the incident and may also experience reduced mission delivery capability that results in a loss of public confidence. Enterprise risk management practices need to incorporate information security risk to develop a complete picture of the risk environment for the organization. Similarly, organizational perspectives on enterprise risk—particularly including determinations of risk tolerance—may drive or constrain system-specific decisions about functionality, security control implementation, continuous monitoring, and initial and ongoing system authorization.

Information security risk management may look somewhat different from organization to organization, even among organizations like federal government agencies that often follow the same risk management guidance. The historical pattern of inconsistent risk management practices among and even within agencies led NIST to reframe much of its information security management guidance in the context of risk management as defined in Special Publication 800-39, a new document published in 2011 that offers an organizational perspective on managing risk associated with the operation and use of information systems [7]. Special Publication 800-39 defines and describes at a high level an overarching four-phase process for information security risk management, depicted in Figure 13.2, and directs those implementing the process to additional publications for more detailed guidance on risk assessment [8] and risk monitoring [9]. In its guidance, NIST reiterates the essential role of information technology to enable the successful achievement of mission outcomes and ascribes similar importance to recognizing and managing information security risk as a prerequisite to attaining organizational goals and objectives. NIST envisions agency risk management programs characterized by [10]:


Figure 13.2. NIST Defines an Integrated, Iterative Four-Step Risk Management Process That Establishes Organizational, Mission and Business, and Information System-Level Roles and Responsibilities, Activities, and Communication Flows [11]

Senior leaders that recognize the importance of managing information security risk and establish appropriate governance structures for managing such risk.

Effective execution of risk management processes across organization, mission and business, and information systems tiers.

An organizational climate where information security risk is considered within the context of mission and business process design, enterprise architecture definition, and system development life cycle processes.

Better understanding among individuals with responsibilities for information system implementation or operation of how information security risk associated with their systems translates into organization-wide risk that may ultimately affect mission success.

Managing information security risk at an organizational level represents a potential change in governance practices for federal agencies and demands an executive-level commitment both to assign risk management responsibilities to senior leaders and to hold those leaders accountable for their risk management decisions and for implementing organizational risk management programs. The organizational perspective also requires sufficient understanding on the part of senior management to recognize information security risks to the agency, establish organizational risk tolerance levels, and communicate information about risk and risk tolerance throughout the organization for use in decision making at all levels.

Key Risk Management Concepts

Federal risk management guidance relies on a core set of concepts and definitions that all organizational personnel involved in risk management should understand. Risk management is a subjective process, and many of the elements used in risk determination activities are susceptible to different interpretations. NIST provided explicit examples, taxonomies, constructs, and scales in its latest guidance on conducting risk assessments [12] that may encourage more consistent application of core risk management concepts, but ultimately each organization is responsible for establishing and clearly communicating any organization-wide definitions or usage expectations. To the extent that organizational risk managers can standardize and enforce common definitions and risk rating levels, the organization may be able to facilitate the necessary step of prioritizing risk across the organization that stems from multiple sources and systems. NIST guidance adopts definitions of threat, vulnerability, and risk from the Committee on National Security Systems (CNSS) National Information Assurance Glossary[13], and uses tailored connotations of the terms likelihood and impact applied to risk management in general and risk assessment in particular [14].

Threats

A threat is “any circumstance or event with the potential to adversely impact organizational operations (including mission, functions, image, or reputation), organizational assets, individuals, other organizations, or the Nation through an information system via unauthorized access, destruction, disclosure, modification of information, and/or denial of service.” NIST guidance distinguishes between threat sources—causal agents with the capability to exploit a vulnerability to cause harm—and threat events: situations or circumstances with adverse impact caused by threat sources [15]. Risk managers need to consider a wide variety of threat sources and potentially relevant threat events, drawing upon organizational knowledge and characteristics of information systems and their operating environments as well as external sources of threat information. In its revised draft of Special Publication 800-30, NIST categorizes threat sources into four primary categories—adversarial, accidental, structural, and environmental—and provides an extensive (though not comprehensive) list of over 70 threat events [16].
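
As a small illustration of the four threat-source categories named above, the sketch below tags hypothetical threat events with their causal category. The event descriptions are invented, and the data structure is an assumption, not taken from Special Publication 800-30.

```python
# Illustrative sketch: tagging threat events with the four SP 800-30
# threat-source categories mentioned above (adversarial, accidental,
# structural, environmental). Event descriptions are hypothetical.

from enum import Enum

class ThreatSourceCategory(Enum):
    ADVERSARIAL = "adversarial"      # e.g., external attackers, malicious insiders
    ACCIDENTAL = "accidental"        # e.g., erroneous actions by authorized users
    STRUCTURAL = "structural"        # e.g., equipment or software failures
    ENVIRONMENTAL = "environmental"  # e.g., natural disasters, infrastructure outages

threat_events = [
    {"event": "Phishing campaign against agency staff", "source": ThreatSourceCategory.ADVERSARIAL},
    {"event": "Administrator misconfigures firewall rule", "source": ThreatSourceCategory.ACCIDENTAL},
    {"event": "Disk array controller failure", "source": ThreatSourceCategory.STRUCTURAL},
    {"event": "Flooding of the primary data center", "source": ThreatSourceCategory.ENVIRONMENTAL},
]

for e in threat_events:
    print(f'{e["source"].value:>13}: {e["event"]}')
```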

Vulnerabilities

A vulnerability is a “weakness in an information system, system security procedures, internal controls, or implementation that could be exploited by a threat source.” Information system vulnerabilities often stem from missing or incorrectly configured security controls (as described in detail in Chapters 8 and 11 in the context of the security control assessment process) and also can arise in organizational governance structures, business processes, enterprise architecture, information security architecture, facilities, equipment, system development life cycle processes, supply chain activities, and relationships with external service providers [17]. Identifying, evaluating, and remediating vulnerabilities are core elements of several information security processes supporting risk management, including security control selection, implementation, and assessment as well as continuous monitoring. Vulnerability awareness is important at all levels of the organization, particularly when considering vulnerabilities due to predisposing conditions—such as geographic location—that increase the likelihood or severity of adverse events but cannot easily be addressed at the information system level. Special Publication 800-39 highlights differences in risk management activities related to vulnerabilities at organization, mission and business, and information system levels, summarized in the Three-Tiered Approach section later in this chapter.

Likelihood

Likelihood in a risk management context is an estimate of the chance that an event will occur resulting in an adverse impact to the organization. Quantitative risk analysis sometimes uses formal statistical methods, patterns of historical observations, or predictive models to measure the probability of occurrence for a given event and determine its likelihood. In qualitative or semi-quantitative risk analysis approaches such as the method prescribed in Special Publication 800-30, likelihood determinations focus less on statistical probability and more often reflect relative characterizations of factors such as a threat source’s intent and capability and the visibility or attractiveness of the organization as a target [6]. For emergent vulnerabilities, security personnel may consider factors such as the public availability of code, scripts, or other exploit methods or the susceptibility of systems to remote exploit attempts to help determine the range of potential threat agents that might try to capitalize on a vulnerability and to better estimate the likelihood that such attempts could occur. Risk assessors use these factors, in combination with past experience, anecdotal evidence, and expert judgment when available, to assign likelihood scores that allow comparison among multiple threats and adverse impacts and—if organizations implement consistent scoring methods—support meaningful comparisons across different information systems, business processes, and mission functions.

Impact

Impact is a measure of the magnitude of harm that could result from the occurrence of an adverse event. While positive or negative impacts are theoretically possible, even from a single event, risk management tends to focus only on adverse impacts, driven in part by federal standards on categorizing information systems according to risk levels defined in terms of adverse impact. FIPS 199 distinguishes among low, moderate, and high potential impacts corresponding to “limited,” “serious,” and “severe or catastrophic” adverse effects, respectively [18]. Current NIST guidance on risk assessments expands the qualitative impact levels to five from three, adding very low for “negligible” adverse effects and very high for “multiple severe or catastrophic” adverse effects. This guidance also proposes a similar five-level rating scale for the range or scope of adverse effects due to threat events, and provides examples of adverse impacts in five categories based on the subject harmed: operations, assets, individuals, other organizations, and the nation [19]. Impact ratings significantly influence overall risk level determinations and can—depending on internal and external policies, regulatory mandates, and other drivers—produce specific security requirements that agencies and system owners must satisfy through the effective implementation of security controls.
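
The sketch below records the qualitative impact scales described above: the three FIPS 199 potential-impact values with their adverse-effect descriptions, and the five-level scale used in current NIST risk assessment guidance. The numeric ordering index is an illustrative addition, not part of either document.

```python
# Sketch of the qualitative impact scales discussed above. The ordering index
# is an illustrative addition, not part of FIPS 199 or SP 800-30.

FIPS_199_IMPACT = {
    "low": "limited adverse effect",
    "moderate": "serious adverse effect",
    "high": "severe or catastrophic adverse effect",
}

NIST_800_30_IMPACT_LEVELS = ["very low", "low", "moderate", "high", "very high"]

def impact_index(level: str) -> int:
    """Return the position of a qualitative impact level on the five-level scale."""
    return NIST_800_30_IMPACT_LEVELS.index(level)

print(FIPS_199_IMPACT["moderate"])   # -> serious adverse effect
print(impact_index("very high"))     # -> 4
```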

Warning

The use of standardized rating scales for the severity of threats and vulnerabilities, likelihood of occurrence, impact levels, and risk offers enormous value to organizations seeking consistent application of risk management practices, but the subjective nature of the definitions corresponding to numeric rating scores can produce a false sense of consistency. Risk executives operating at the organization tier need to establish clear rating guidelines and organization-specific interpretations of relative terms such as “limited” and “severe” to help ensure that the ratings are applied in the same way across the organization.

Risk

Risk is “a measure of the extent to which an entity is threatened by a potential circumstance or event”, typically represented as a function of adverse impact due to an event and the likelihood of the event occurring. Risk in a general sense comprises many different sources and types that organizations address through enterprise risk management [20]. FISMA and associated NIST guidance focus on information security risk, with particular emphasis on information system-related risks arising from the loss of confidentiality, integrity, or availability of information or information systems. The range of potential adverse impacts to organizations from information security risk includes those affecting operations, organizational assets, individuals, other organizations, and the nation. Organizations express risk in different ways and with different scope depending on which level of the organization is involved—information system owners typically identify and rate risk from multiple threat sources applicable to their systems, while mission and business and organizational characterizations of risk may seek to rank or prioritize different risk ratings across the organization or aggregate multiple risk ratings to provide an enterprise risk perspective. Risk is the primary input to organizational risk management, providing the basic unit of analysis for risk assessment and monitoring and the core information used to determine appropriate risk responses and any needed strategic or tactical adjustments to risk management strategy [21].
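
As a sketch of the ranking and aggregation described above, the snippet below takes hypothetical per-system risk entries, assigns each a simple semi-quantitative rating, and sorts them into an organization-wide priority list. The systems, threats, and scores are invented for illustration.

```python
# Sketch (hypothetical data): rolling per-system risk ratings up into a
# ranked, organization-wide view for prioritization.

risks = [
    {"system": "HR portal", "threat": "credential theft", "likelihood": 4, "impact": 3},
    {"system": "Case management", "threat": "backup data loss", "likelihood": 2, "impact": 5},
    {"system": "Public website", "threat": "defacement", "likelihood": 3, "impact": 2},
]

for r in risks:
    # Simple semi-quantitative rating; real programs may use lookup tables instead.
    r["rating"] = r["likelihood"] * r["impact"]

for r in sorted(risks, key=lambda entry: entry["rating"], reverse=True):
    print(f'{r["rating"]:>2}  {r["system"]}: {r["threat"]}')
```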


URL: https://www.sciencedirect.com/science/article/pii/B9781597496414000138

Risk Management

Sokratis K. Katsikas, in Computer and Information Security Handbook (Third Edition), 2013

2 Expressing and Measuring Risk

Information security risk “is measured in terms of a combination of the likelihood of an event and its consequence.” Because we are interested in events related to information security, we define an information security event as “an identified occurrence of a system, service or network state indicating a possible breach of information security policy or failure of safeguards, or a previously unknown situation that may be security relevant.”8 In addition, an information security incident is “indicated by a single or a series of unwanted information security events that have a significant probability of compromising business operations and threatening information security.” These definitions actually invert the investment assessment model, in which an investment is considered worth making when its cost is less than the product of the expected profit times the likelihood of the profit occurring. In our case, risk R is defined as the product of the likelihood L of a security incident occurring times the impact I that the organization will incur owing to the incident: that is, R = L × I.9

To measure risk, we adopt the fundamental principles and scientific background of statistics and probability theory, particularly of the area known as Bayesian statistics, after the mathematician Thomas Bayes (1702–1761), who formalized the namesake theorem. Bayesian statistics is based on the view that the likelihood of an event happening in the future is measurable. This likelihood can be calculated if the factors affecting it are analyzed. For example, we are able to compute the probability of our data being stolen as a function of the probability an intruder will attempt to intrude into our system and the probability that he will succeed. In risk analysis terms, the former probability corresponds to the likelihood of the threat occurring and the latter corresponds to the likelihood of the vulnerability being successfully exploited. Thus, risk analysis assesses the likelihood that a security incident will happen, by analyzing and assessing the factors that are related to its occurrence, namely the threats and the vulnerabilities. Subsequently, it combines this likelihood with the impact resulting from the incident occurring to calculate the system risk. Risk analysis is a necessary prerequisite for subsequently treating risk. Risk treatment pertains to controlling the risk so that it remains within acceptable levels. Risk can be reduced by applying security measures; it can be shared, by outsourcing or by insuring; it can be avoided; or it can be accepted, in the sense that the organization accepts the likely impact of a security incident.

The likelihood of a security incident occurring is a function of the likelihood that a threat appears and the likelihood that the threat can exploit the relevant system vulnerabilities successfully. The consequences of the occurrence of a security incident are a function of the likely impact the incident will have on the organization as a result of the harm that the organization assets will sustain. Harm, in turn, is a function of the value of the assets to the organization. Thus, risk R is a function of four elements: (1) A, the value of the assets; (2) T, the severity and likelihood of appearance of the threats; (3) V, the nature and extent of the vulnerabilities and the likelihood that a threat can successfully exploit them; and (4) I, the likely impact of the harm should the threat succeed: that is, R = f(A, T, V, I).
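
The short sketch below illustrates the composition just described: the likelihood of an incident is the likelihood that the threat appears multiplied by the likelihood that it successfully exploits the vulnerability, and the risk then combines that likelihood with the impact. The probabilities and the monetary impact are hypothetical.

```python
# Minimal sketch (illustrative numbers) of composing incident likelihood from
# threat likelihood and exploit likelihood, then combining it with impact.

p_threat = 0.4     # likelihood that the threat materializes (e.g., an intrusion attempt)
p_exploit = 0.25   # likelihood that the attempt exploits the vulnerability successfully
impact = 80_000    # impact on the organization, in monetary terms

likelihood_incident = p_threat * p_exploit
risk = likelihood_incident * impact  # R = L x I
print(f"L = {likelihood_incident:.2f}, R = {risk:,.0f}")
```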

If the impact is expressed in monetary terms then, since the likelihood is dimensionless, risk can also be expressed in monetary terms. This approach has the advantage of making the risk directly comparable to the cost of acquiring and installing security measures. Because security is often one of several competing alternatives for capital investment, the existence of a cost–benefit analysis that would offer proof that security will produce benefits that equal or exceed its cost is of great interest to the management of the organization. Of even more interest to management is an analysis of the investment opportunity costs: that is, its comparison with other capital investment options.10 However, expressing risk in monetary terms is not always possible or desirable, because harm to some kinds of assets (human life) cannot (and should not) be assessed in monetary terms. This is why risk is usually expressed in nonmonetary terms, on a simple dimensionless scale.

Assets in an organization are usually diverse. Because of this diversity, it is likely that some assets that have a known monetary value (hardware) can be valued in the local currency, whereas others of a more qualitative nature (data or information) may be assigned a numerical value based on the organization's perception of their value. This value is assessed in terms of the assets' importance to the organization or their potential value in different business opportunities. The legal and business requirements are also taken into account, as are the impacts to the asset itself and to the related business interests resulting from loss of one or more of the information security attributes (confidentiality, integrity, or availability). One way to express asset values is to use the business impacts that unwanted incidents, such as disclosure, modification, nonavailability, and/or destruction, would have on the asset and the related business interests that would be directly or indirectly damaged. An information security incident can affect more than one asset or only a part of an asset. Impact is related to the degree of success of the incident. Impact is considered to have either an immediate (operational) effect or a future (business) effect that includes financial and market consequences. An immediate (operational) impact is either direct or indirect.

A direct impact may result because of the financial replacement value of a lost (part of) asset or the cost of acquisition, configuration, and installation of the new asset or backup, or the cost of suspended operations resulting from the incident until the service provided by the asset(s) is restored. An indirect impact may result because financial resources needed to replace or repair an asset would have been used elsewhere (opportunity cost), or owing to the cost of interrupted operations or to potential misuse of information obtained through a security breach, or because of the violation of statutory or regulatory obligations or of ethical codes of conduct.

These considerations should be reflected in the asset values. This is why asset valuation (particularly of intangible assets) is usually done through impact assessment. Thus, impact valuation is not performed separately, but is embedded within the asset valuation process.

The responsibility for identifying a suitable asset valuation scale lies with the organization. Usually, a three-value scale (low, medium, and high) or a five-value scale (negligible, low, medium, high, and very high) is used.11

Threats can be classified as deliberate or accidental. The likelihood of deliberate threats depends on the motivation, knowledge, capacity, and resources available to possible attackers and the attractiveness of assets to sophisticated attacks. On the other hand, the likelihood of accidental threats can be estimated using statistics and experience. The likelihood of these threats might also be related to the organization's proximity to sources of danger, such as major roads or rail routes, and factories dealing with dangerous material such as chemical materials or oil. Also the organization's geographical location will affect the possibility of extreme weather conditions. The likelihood of human error (one of the most common accidental threats) and equipment malfunction should also be estimated. As already noted, the responsibility for identifying a suitable threat valuation scale lies with the organization. What is important here is that the interpretation of the levels be consistent throughout the organization and clearly convey the differences between the levels to those responsible for providing input to the threat valuation process. For example, if a three-value scale is used, the value low can be interpreted to mean that it is not likely that the threat will occur; there are no incidents, statistics, or motives that indicate that this is likely to happen. The value medium can be interpreted to mean that it is possible that the threat will occur, there have been incidents in the past or statistics or other information that indicate that this or similar threats have occurred sometime before, or there is an indication that there might be some reasons for an attacker to carry out such an action. Finally, the value high can be interpreted to mean that the threat is expected to occur, there are incidents, statistics, or other information that indicate that the threat is likely to occur, or there might be strong reasons or motives for an attacker to carry out such an action.

Vulnerabilities can be related to the physical environment of the system, to the personnel, management, and administration procedures and security measures within the organization, to the business operations and service delivery, or to the hardware, software, or communications equipment and facilities. Vulnerabilities are reduced by installed security measures. The nature and extent as well as the likelihood of a threat successfully exploiting the three former classes of vulnerabilities can be estimated based on information on past incidents, on new developments and trends, and on experience. The nature and extent as well as the likelihood of a threat successfully exploiting the latter class, often termed technical vulnerabilities, can be estimated using automated vulnerability-scanning tools, security testing and evaluation, penetration testing, or code review. As in the case of threats, the responsibility for identifying a suitable vulnerability valuation scale lies with the organization. If a three-value scale is used, the value low can be interpreted to mean that the vulnerability is hard to exploit and the protection in place is good. The value medium can be interpreted to mean that the vulnerability might be exploited but some protection is in place. The value high can be interpreted to mean that it is easy to exploit the vulnerability and there is little or no protection in place.


URL: https://www.sciencedirect.com/science/article/pii/B978012803843700034X

Information Security Risk Assessments

Mark Talabis, Jason Martin, in Information Security Risk Assessment Toolkit, 2013

Information Security Risk

Now that we have a high-level definition of risk as well as an understanding of the primary components of risk, it’s time to put this all into the context of information security risk. As we mentioned at the beginning of this chapter, each field or discipline has its own definition of risk because each field has its own perception of what risk is.

In information security, risk revolves around three important concepts: threats, vulnerabilities and impact (see Figure 1.4).


Figure 1.4. Risk and Information Security Concepts

1. Threat is an event, either an action or an inaction, that leads to a negative or unwanted situation.

2. Vulnerabilities are weaknesses or environmental factors that increase the probability or likelihood of the threat being successful.

3. Impact is the outcome, such as a loss or the potential for a loss, due to the threat leveraging the vulnerability.

Sound familiar? Of course it does. We have talked about all of this before. Figure 1.5 shows how to apply them to our risk components illustration.


Figure 1.5. Illustration of an Information Security Risk Statement (Unauthorized Access)

We see that threat, vulnerability, and impact are just different interpretations of event, probability and outcome. This is important to note, as this will assist you in explaining your risk definition to other people reviewing your assessment. As we talked about earlier in this section, other people who are involved in risk management functions within your organization may not be familiar with the concepts of threats and vulnerabilities and may be more familiar with the generic risk concepts of event and probability.

As seen in Figure 1.5, we can overlay our hacker and backup tape examples to see how the components work together to illustrate a real risk statement.

In this example, the full risk statement is:

Unauthorized access by hackers through exploitation of weak access controls within the application could lead to the disclosure of sensitive data.

Our second example is illustrated in Figure 1.6.


Figure 1.6. Illustration of an Information Security Risk Statement (Unencrypted Media)

For the example in Figure 1.6, the full risk statement is:

Accidental loss or theft of unencrypted backup tapes could lead to the disclosure of sensitive data.
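
Following the pattern of the two example statements above, a risk statement can be assembled mechanically from its threat, vulnerability, and impact components. The helper below is a sketch of that idea, not a function from the book's companion toolkit, and the phrasing template is an assumption.

```python
# Sketch (not from the book's companion toolkit): composing a risk statement
# from a threat, a vulnerability, and an impact, as in the examples above.

def risk_statement(threat: str, vulnerability: str, impact: str) -> str:
    """Combine the three components into a single sentence."""
    return f"{threat} through {vulnerability} could lead to {impact}."

print(risk_statement(
    "Unauthorized access by hackers",
    "exploitation of weak access controls within the application",
    "the disclosure of sensitive data",
))
```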

The Real World

Jane has extensive experience in IT, particularly in application development and operations; however, she is relatively new to the information security field. She received a battlefield promotion to the role of information security officer at the financial organization she worked for (ACME Financials) after a data breach occurred. Focusing on information security she obtained her CISSP designation and built up the security program at her company by aligning with well-known information security frameworks.

Jane excelled in her position and came to the attention of a large healthcare organization after one of the auditors of ACME Financials mentioned her to the CIO at the healthcare organization. After some aggressive recruiting, the CIO convinced Jane to join the hospital system as their information security officer. Although she has had limited exposure to the Health Insurance Portability and Accountability Act (HIPAA), she is comfortable working in a regulated environment, as her previous organization was subject to Gramm-Leach-Bliley Act (GLBA) requirements. The position is new to the hospital system and was created in response to an audit comment noted in a HIPAA audit performed by an external party.

One of the primary tasks that the CIO has for Jane is to build up the information security program. Jane is actually a little hesitant since the organization is significantly larger than her prior company; however, she is up to the challenge. Throughout this book we will keep coming back to Jane’s situation and see how risk assessments play a role in her journey to keep her new company, and frankly her new job, safe!

Let’s talk about Jane’s first day on the job. She wasn’t expecting much: just show up at HR, get her keys and badges, and attend the new employee orientation. Basically, just ease into her new job and allow herself to adjust and get a feel for the organization. As you well know, that seldom happens in the real world. Instead of sitting in new employee orientation, the CIO of the hospital decided on the spur of the moment to ask her to speak to the IT managers, some members of the hospital’s risk committee, the audit department, and other select department heads about what she believes the organization’s primary information security risks are!

Whoa! Definitely not the first day Jane was expecting. But she wasn’t going to let this rattle her. Well, she was rattled a little, but she was not completely unprepared. In her prior company she had implemented her program using a risk-based approach, so she was familiar with the concept of risk. She also knew that this diverse group of people would probably come to the meeting with their own preset ideas on the definition of risk in the context of their specific department or field. Since it was her first day, she really didn’t want to ruffle any feathers by minimizing or highlighting specific risks, since she didn’t feel like she knew enough about the organization’s operating environment to make that call.

With all of that in mind, instead of going up and enumerating risks from out of the air, Jane decided to start with a conciliatory note:

“Each one of us here would most likely have their own ideas of what the “primary” risks are. For example, for audit, you would probably be concerned about the possibility of a lack of compliance to HIPAA. For the department heads here, this could be the possibility that we’ll be unable to deliver service to our patients. For others, it could be a possible inability to protect our patient’s personal information. All of these are valid risks and all could produce a negative impact to our organization. But in order to answer the question of which ones are the “primary” risks to the organization, we need to start measuring risk through a documented and repeatable process. This is one of the main things that I plan to start with, a formal risk assessment process for information security. Though ultimately risk is always based on perception, a formal process will allow us to look at all the risks in a more objective manner. What I would really like to do now is go around the table and ask each of you to tell me what risks are of primary concern to your department.”

As Jane waits for a response from the group she is met with blank stares! Not one to give up, she decided to just start with the person immediately on her left and then work her way around the room, helping each of the participants to convey their risk in a structured way by utilizing her knowledge of the definitions and components of risk. For example when she was talking to the applications manager:

Jane: “What security event are you worried about?”

Application Manager: “Hmmm. Not much really. But I guess hackers might be able to get into our hospital website?”

Jane: “That’s worth looking into. What things do you have in place to protect from hackers?”

Applications Manager: “Hmmm. Nothing on our side. But we do have a firewall. Besides the website is just html and I don’t think they’ll be able to use anything there.”

Jane: “But they can deface the website right?”

Applications Manager: “Right. That’s true, they can deface the website by changing the files.”

CIO: “Hmmm. I think we’ll want to look more into that. That would be really embarrassing to the hospital. If people think we can’t protect our website, then how would they be comfortable that we can protect their sensitive information?”

By going around the table, Jane is beginning to see trends in the risks that the people in the room are most concerned with and, equally important, is able to start identifying preconceptions that may be wrong. You’ve also probably noticed that she is doing it in a very structured way: ask for the threat, then the vulnerability, and finally the asset. It’s good to know the basics, since if push comes to shove you can fall back on them to guide a productive conversation about risk.

By going around the room and letting other people talk, with some gentle guiding, she was able to quickly learn quite a bit about the perception of risk within her new organization. She did run into some snags: one of the attendees was adamant that the risk assessment could be done in a day and was under the impression that the meeting they were having was the risk assessment, not understanding why the process would actually take some time and require meetings with multiple groups. Now, the meeting was probably not what Jane’s CIO was expecting, but hey, it’s her first day and she knows she is going to educate her new boss as much as, or probably even more than, anyone else in the organization. Although she did it indirectly, Jane was able to convey that one person cannot identify all risks alone, since different perspectives are needed, and that this would ultimately be an organizational effort. She also demonstrated her knowledge of the concept of risk and used that knowledge to create a structured information-gathering approach for questioning the meeting participants.

All in all, not a bad first day for our information security officer!

Now that we have covered defining risk and its components, we will delve deeper into the background, purpose, and objectives of an information security risk assessment.


URL: https://www.sciencedirect.com/science/article/pii/B9781597497350000014

Information Security Risk Assessment: Reporting

Mark Talabis, Jason Martin, in Information Security Risk Assessment Toolkit, 2012

Introduction

In an information security risk assessment, the compilation of all your results into the final information security risk assessment report is often as important as all the fieldwork that the assessor has performed. Some would even argue that it is the most important part of the risk assessment process. This is due to the fact that the final report and related derivative information (e.g. slide decks or summary memos) are the only deliverables that the stakeholders will see. It is essential to the credibility of your entire process that the final report accurately captures all the results and reflects all the time and effort that was put into the process.

Having a cohesive final report will allow the assessor to communicate findings clearly to the stakeholders, allowing them to understand how the findings were identified and ultimately, allow them to “buy” into the process enough to support action plans and remediation activities. A poorly written or structured report can bring into question the credibility of the assessor and ultimately invalidate much of the work that was performed.

This chapter is presented differently from the other chapters up to this point. What we provide here is a report template that an assessor can use in putting together a final information security risk assessment report. In presenting the template, we will provide an outline first and then go through each section of it. For each section, we will provide sample content taken from the hypothetical scenarios that we discussed throughout the different chapters of this book.

Note that with all reports, you need to be cognizant of who the reader may be. In many cases the readers of the report, or of information derived from the report, could be anyone from executives of the company to system administrators within IT. The sample report presented in this section is structured to allow executives to gain sufficient information from the executive summary, while detailed risk and mitigation discussions are covered in the body of the report so that those tasked with addressing risk have a clear understanding of what was found.


URL: https://www.sciencedirect.com/science/article/pii/B9781597497350000075

Modeling Information Security Risk

Carl S. Young, in Information Security Science, 2016

Summary

A model for information security risk specifies the dependence of a security parameter on one or more risk factors. Models are useful in making generalizations regarding the behavior of security/threat parameters as a function of risk factors, which can enable estimates of vulnerability.

Specific mathematical functions and concepts are useful in developing simple information security models. Logarithmic functions, exponents and exponential growth, logistic growth, and elementary solid geometry facilitate quantitative risk models, and in particular an understanding of risk factor dependencies. Decibels are expressed as logarithms, and are useful in presenting data that span many orders of magnitude.

Linearity and nonlinearity are essential to the concept of scaling, which compactly expresses the quantitative relationship between security/threat parameters and risk factors as specified in a model. The concept of density has direct application to estimates of vulnerability. In particular, signal intensity or power per unit area is a density measurement that occurs frequently in information security risk assessments.
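
As a small illustration of the logarithmic/decibel idea mentioned above, the snippet below converts power values spanning several orders of magnitude to decibels. The reference power of 1 is an arbitrary assumption for the example.

```python
# Sketch: a logarithmic (decibel) scale compresses quantities that span many
# orders of magnitude. The reference power here is an arbitrary assumption.

import math

def to_decibels(power: float, reference_power: float = 1.0) -> float:
    """Express a power ratio in decibels: dB = 10 * log10(P / P_ref)."""
    return 10 * math.log10(power / reference_power)

for p in (1, 100, 1_000_000):
    print(f"power {p:>9} -> {to_decibels(p):6.1f} dB")
```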


URL: https://www.sciencedirect.com/science/article/pii/B9780128096437000024

Information Security Risk Assessment: Data Collection

Mark Talabis, Jason Martin, in Information Security Risk Assessment Toolkit, 2013

Introduction

The cornerstone of an effective information security risk assessment is data. Without data to support an assessment, there is very little value to the risk assessment, and the assessment you perform can be construed as mere guesswork.

Data collection is by far the most rigorous and most encompassing activity in an information security risk assessment project. There are many factors that affect the success of the data collection phase; however, the single most important factor is planning. Since all of the subsequent phases of the assessment will rely on the information gathered in this phase, not properly planning the data collection phase will have significant repercussions. This phase is also one where you will have to coordinate with people throughout your organization, so effective and appropriate communications are an essential element. We emphasize the word appropriateness in your communications since providing too much or too little information may impair your ability to effectively interact with the individuals or groups that you will rely on for data collection.

Throughout this chapter, we will also be highlighting several critical success factors that you should try to ensure are in place within your organization. These include identifying a strong executive sponsor or sponsors, conducting regular follow-ups with all involved groups, building strong relationships with system owners and contacts, proper asset scoping, leveraging automated data collection mechanisms, identifying key people with strong organizational knowledge, and use of a standard control framework. The existence of these and other factors will be a good predictor of how successful your data collection phase will be.

The main output for this phase is a data container with relevant information about the organization, environment, systems, people, and controls that will be used in the various analyses throughout the project. Depending on the size of the organization, the number of assets, and support from the organization, this phase may take a few weeks or several months.

A sample Gantt chart enumerating the data collection activities is provided in the companion website of this book.


URL: https://www.sciencedirect.com/science/article/pii/B9781597497350000038

What are the key responsibilities for protection of information assets?

CISA Domain 5 – Protection of Information Assets:
Importance of Information Security Management
Logical Access
Network Infrastructure Security
Auditing Information Security Management Framework
Auditing Network Infrastructure Security
Environmental Exposures and Controls
Physical Access Exposures and Controls

What is the primary consideration for an IS auditor while reviewing the prioritization and coordination of IT projects and program management?

The MOST important point of consideration for an IS auditor while reviewing an enterprise's project portfolio is that it is aligned with the business plan.

Which of the following is the first step in selecting the appropriate controls to be implemented in a new business application?

It is necessary to first consider the risk and determine whether it is acceptable to the organization. A risk assessment can identify threats and vulnerabilities and calculate the risk.

Which of the following is the most critical step when planning an IS audit?

Explanation: In planning an audit, the most critical step is identifying the areas of high risk.