Cyber Risk, Security, Economics and Insurance

Below are some notes on cyber risk, security, economics and insurance. First, a definition.

What is Security?

Security

A system is secure if it is protected against all forms of threat

  • Threats are hard to predict
  • And we cannot afford to protect against all of them
  • Security is an economic issue, not just an engineering one
  • Solutions require cost and effort, and may not be worth it
  • Every level of protection is based on cost

Our ability to protect systems is constrained by:

  • Budget
  • Technology
  • Time
  • Law
  • Personnel
  • Ethics & Practice

Risk (opposite of trust)

  • Because we cannot have perfect security, we need to understand the factors that may lead to problems and address them accordingly.
  • This is commonly understood as understanding and addressing risk:
    • Avoidance
    • Recovery (e.g., backups)
    • Risk shifting (pay someone else to take the risk, i.e., insurance)

Risk to what?

  • Confidentiality – information is not disclosed to unauthorized entities
  • Integrity – information is protected against unauthorized creation, modification, deletion
  • Availability
  • Accountability
  • Business continuity
  • Reputation

General Risk Mitigation Principles

  • Least Privilege (LP)
    • Only access to what is needed, “need to know”
  • Defense in Depth (DiD)
    • Multiple layers of security
  • Separation of Duties (SoD)
    • No one person has authority over all security-critical functions

What determines risk?

  • Vulnerabilities
    • Weak passwords, old software, lack of security patching, etc.
  • Threats
    • Breach of private data, theft of IP, denial of service…
  • Consequences
    • Damage to reputation, legal consequence, financial consequence, government fines…

When is risk analysis done?

  • Whenever money is to be spent or resources committed
    • To make best use of money and resources
  • Whenever there are changes in the organization’s systems, policies, incentives (remuneration), operations, equipment, or connectivity
    • To assess their impact on risk and confirm they don’t invalidate underlying security

Who does risk analysis?

  • Usually internal experts
    • They know the organization’s systems and applications
    • There is fear that external experts may leak info on weaknesses (or misuse or sell it)
  • They don’t do it in isolation
    • They involve the whole organization

How long does a risk analysis take? Who sees it?

  • Days (relatively quickly)
    • Not weeks
    • Try to minimize impact on employee schedules
  • Who gets to review the results?
    • Senior managers who sponsored it, anyone who has “need to know”
    • No one else (results are usually considered confidential)

Limits

Coping with risk is constrained

  • Economics limits what we can afford to do
  • Technical limits prevent us from avoiding or recovering from some events
  • Biases lead to some paradoxical choices
  • Laws prevent us from some possible mitigations

Security is attempting to optimize dynamic risk management within constraints

Sources of risk include (see the “Who do you trust?” section below)

  • Buggy software
    • Poor specs, faulty implementation
  • Misused or misconfigured software / hardware
  • Malicious software
  • Hardware
    • Overwriting a magnetic hard disk with zeros, even multiple times, can fail to erase all data
    • Faulty sectors that get bypassed over time (and may still hold data)
  • People
    • Errors
    • Deliberate misuse
  • Examples
    • Air traffic control impersonators
    • Attempted sabotage of encyclopedia
    • Spy within organization

Mitigation of the people risk

  • Separation of Duties (SoD), least privilege, need to know
  • Background checks
  • Departure policies
  • Monitor for anomalous behavior
  • Employee education
  • Reward system

More Sources of risk

  • Management (faulty policies, procedures)
  • Eavesdropping (both physical and electronic)
  • Physical (often ignored)
    • Fire, earthquake, water, etc
    • 25% of FAA centers down for 6 hours due to farmer burying a dead cow (cut cable)
    • 1986 Arpanet outage in New England (severed cables)
  • Legal Risk
    • Lawsuits
  • Regulatory risks
    • Regulators can impose large fines and force expensive/disruptive changes
  • Implicit assumptions
    • About users, operations, trust
  • Supply chain (3rd-party suppliers)

Risk Mitigation

Good risk mitigation is grounded in having a comprehensive security plan, appropriate resources, trained personnel and regular monitoring and updates.

NIST SP 800-39 and the NIST Cybersecurity Framework provide starting points for a comprehensive plan. Other NIST 800-series publications expand on risk analysis and cyber security topics.

  • Static audit tools
    • Scan system and report weakness
  • Password checkers
  • Integrity checkers
  • Network Scanners
  • Audit logs
    • Detect violations, recovery, measure resource utilization
  • Firewalls
  • Intrusion and misuse detection systems
    • Signature based
    • Anomaly based
  • Integrity management
    • Immutable files
    • Append only files
    • Change detection
  • Crypto
    • Encryption, signed code
  • Fault tolerance
    • Redundancy, replication
    • RAID disks
  • Error recovery
    • Backward: rollback to an earlier state
    • Forward: retry, keep going
  • Backups
    • Frequency
    • Problem: backups may contain data that should have been deleted
  • Minimize services offered by web server
    • Disable all unnecessary services on servers
    • Minimize attack surface
  • Limit logins to servers
    • No remote access
  • Require use of secure tools (disable others)
  • Avoid historically trouble-prone software
  • Search for dangerous accounts
    • Default password
    • Dormant account, historical account
  • Use uninterrupted power supply
  • Use aliases
  • Watch for unusual usage
    • E.g., too many crypt() calls, suggesting password guessing
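
Integrity checkers and change-detection tools above share one idea: record cryptographic digests of known-good files, then re-hash and compare later. A minimal sketch in Python (the file set and function names are illustrative, not from any particular tool):

```python
import hashlib
from pathlib import Path

def hash_file(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def baseline(paths) -> dict:
    """Record known-good digests for a set of files."""
    return {str(p): hash_file(Path(p)) for p in paths}

def check(baseline_db: dict) -> list:
    """Re-hash each file and report any that changed or disappeared."""
    changed = []
    for path, digest in baseline_db.items():
        p = Path(path)
        if not p.exists() or hash_file(p) != digest:
            changed.append(path)
    return changed
```

The baseline itself must live where an attacker cannot rewrite it (e.g., offline or read-only media); otherwise a compromise can simply re-baseline the tampered files.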

Risk = f (Threat, Vulnerability, Consequence)

  • Which function f()?
    • Often used is product:
    • Risk = Threat * Vulnerability * Consequence
  • This product formula is often criticized for:
    • The subjectivity and ambiguity of the quantifications used for Threat, Vulnerability, and Consequence
    • Failure to adjust for correlations
    • Failure to capture how adversaries adapt based on their information and experience
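
The product form is trivial to compute; the contested part is justifying the inputs. A sketch, assuming each factor is a subjective rating on a 0–1 scale:

```python
def risk_score(threat: float, vulnerability: float, consequence: float) -> float:
    """Multiplicative model: Risk = Threat * Vulnerability * Consequence.

    Each input is a subjective rating on an assumed 0-1 scale; the product
    inherits that subjectivity and ignores correlations between risks.
    """
    return threat * vulnerability * consequence

# A likely threat (0.8) against a moderately vulnerable system (0.5)
# with severe consequences (0.9) scores roughly 0.36.
```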

Typical usage of scores

  • Risk scores are used to guide the funding
  • Greedy heuristic: Higher scores get priority
    • An ok approach if risks are uncorrelated
    • But correlations are pervasive in cyber security
  • Even when there are no correlations, it’s suboptimal
    • Priority scores fail to capture optimal strategies
  • Correlations can make greedy very suboptimal
    • Especially if risk-mitigation budgets are limited
    • It can be worse than random, because ignoring correlations can reverse risk-score rankings
  • Even though better alternatives exist, greedy is in widespread use
    • In risk-management software, ISO standards, laws and regulations, and textbooks

 

Who do you trust?

When it comes to technology and cyber security we need to understand who we trust. For example, which of these people can be trusted:

  • The supplier/vendor
    • Did their developers follow best practices?
    • Did they properly test and quality-check?
    • How do they respond to findings/bugs?
  • Employees
  • System Administrator
  • Consultants
  • Leadership

An example perspective, the what-if scenario:

The “What-If” Scenario: what the attacker might do after gaining root access, and your responses

  • Attacker: plants a back door in the /bin/login program to allow unauthorized access.
    Response: You use PGP to create a digital signature of all system programs. You check the signatures every day.
  • Attacker: modifies the version of PGP that you are using, so that it will report that the signature on /bin/login verifies, even if it doesn’t.
    Response: You copy /bin/login onto another computer before verifying it with a trusted copy of PGP.
  • Attacker: modifies your computer’s kernel by adding loadable modules, so that when /bin/login is sent through a TCP connection, the original /bin/login, rather than the modified version, is sent.
    Response: You put a copy of PGP on a removable hard disk. You mount the hard disk to perform the signature verification and then unmount it. Furthermore, you put a good copy of /bin/login onto your removable hard disk and then copy the good program over the installed version on a regular basis.
  • Attacker: regains control of your system and further modifies the kernel so that the modification to /bin/login is patched into the running program after it loads. Any attempt to read the contents of the /bin/login file results in the original, unmodified version.
    Response: You reinstall the entire system software, and configure the system to boot from a read-only device such as a CD-ROM.
  • Attacker: because the system now boots from a CD-ROM, you cannot easily update system software as bugs are discovered. The attacker waits for a bug to crop up in one of your installed programs, such as sendmail. When the bug is reported, the attacker will be ready to pounce.
    Response: Your move . . .
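
Each escalation in the scenario turns on where the trusted reference lives. The “trusted copy on removable media” step can be sketched as a byte-for-byte comparison (a real defense would use offline PGP signature verification, and, as the scenario shows, even this fails once the kernel lies about file contents):

```python
from pathlib import Path

def verify_against_trusted(installed: str, trusted_copy: str) -> bool:
    """Compare an installed binary byte-for-byte against a known-good
    copy kept on read-only or removable media."""
    return Path(installed).read_bytes() == Path(trusted_copy).read_bytes()
```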

 

Economics

Classical Economics (physical goods)

  • The price at equilibrium is the marginal cost of production
  • The price of some good X equals its break-even cost; anything less than that would be a loss, making production unsustainable

Information Economics (digital goods)

  • The marginal cost of producing an extra copy is so close to zero that it’s treated as 0
  • Example:
    • Encyclopedia Britannica (32 volumes) $1600
    • Microsoft Encarta $50
    • Wikipedia FREE
  • Producers can give it away for free and make money in other ways (e.g., ads)

The Economics of Information Security

  • Systems are particularly prone to failure when the person guarding them is not the person who suffers when they fail
  • The tools and concepts of game theory and microeconomic theory are becoming just as important to the security engineer as the mathematics of cryptography.
  • The difficulty in measuring information security risks presents another challenge: these risks cannot be managed better until they can be measured better.
  • DRM, which stands for Digital Rights Management, is a method of securing digital content to prevent unauthorized use and piracy of digital media. This mechanism prevents users from copying, redistributing, or converting content in a way that is not explicitly authorized by the content provider.
  • People did not spend as much on protecting their computers as they might have
  • In economic theory, a hidden action problem arises when two parties wish to transact, but one party can take unobservable actions that impact the transaction.
  • diversity (with each node storing its preferred resource mix) performs better under attack than solidarity (where each node stores the same resource mix, which is not usually its preference). Diversity increases node utility which in turn makes nodes willing to allocate higher defense budgets.
  • Economists call this a network externality: a network, or a community of software users, is more valuable to its members the larger it is (e.g., the Windows OS)
  • A number of core Internet protocols, such as DNS and routing, are considered insecure. More secure protocols exist; the challenge is to get them adopted
  • There is a debate between software vendors and security researchers over whether actively seeking and disclosing vulnerabilities is socially desirable. Some have argued against disclosure and frequent patching if vulnerabilities are correlated.
  • In practice, most commercial software contains design and implementation flaws that could easily have been prevented. Although vendors are capable of creating more secure software, the economics of the software industry provide them with little incentive to do so.
  • Vendors may make claims about the security of their products, but buyers have no reason to trust them. In many cases, even the vendor does not know how secure its software is. So buyers have no reason to pay more for more secure software, and vendors are disinclined to invest in protection.
  • One criticism of market-based approaches is that they might increase the number of identified vulnerabilities by compensating people who would otherwise not search for flaws. Thus, some care must be exercised in designing them.

Open Source Developers Still Not Interested in Secure Coding

  • FOSS developers are resistant to spending time fixing vulnerabilities
  • Only 2.3% of FOSS developers’ time is spent improving the security of their code
  • “Shifting left” has not pervaded FOSS
  • Companies must implement their own security checks/procedures
  • Some of this attention is driven by media coverage of vulnerabilities
  • As we see an increasing number of companies actively paying their employees to work on FOSS projects, these employers should incentivize their employees to both write secure code from the beginning, and also spend some time helping find and address existing security vulnerabilities
  • The Open Software Security Foundation recommended that organizations who pay employees to contribute to open source projects should also contribute to security audits and have those employees rewrite portions or components of those libraries. Part of such a rewrite could be to switch to a memory-safe language, the FOSS Contributor Survey report said.

Open-source developers say securing their code is a soul-withering waste of time

  • A survey of ~1,200 FOSS contributors found security low on their list of priorities
  • Only 2.27% of contributors’ time was spent on security issues
  • Of the 1,196 survey respondents, 91% reported being male and between 25 and 44 years old.
  • Developers generally do not want to become security auditors; they want to receive the results of audits.
  • One way to improve a rewrite’s security is to switch from memory-unsafe languages (such as C or C++) into memory-safe languages (such as nearly all other languages)
  • Nearly half (48.7%) said they were paid by their employer for time spent on open-source contributions, while 44.02% said they were not paid at all
  • Understanding FOSS contributor motivations and behavior is a key piece of ensuring the future security and sustainability of this critical infrastructure.

 

Network externalities

  • When the value of a network grows super-linearly in the number of users N
  • Examples
    • Fax machine use grew exponentially: each business bought one to reach the N other businesses that already had one
    • N buyers/sellers on an auction site makes that auction site grow
    • N developers help grow an OS by writing software for it
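
The super-linear growth in the examples above is often modeled with Metcalfe’s law: a network’s value is proportional to the number of distinct user pairs, roughly N². A sketch (value_per_link is an assumed illustrative constant):

```python
def network_value(n_users: int, value_per_link: float = 1.0) -> float:
    """Metcalfe-style value: proportional to the n*(n-1)/2 user pairs."""
    return value_per_link * n_users * (n_users - 1) / 2

# Doubling the user base roughly quadruples the value:
# network_value(100) = 4950.0, network_value(200) = 19900.0
```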

Lock-in

  • Using a platform can leave users ‘locked in’ because the cost of switching is too high
  • Examples
    • New hardware/software
    • Converting files
    • Retraining staff
    • Loss of inter-operability 
  • This applies to databases, phone exchanges, online services, ISPs, etc.

Another consequence of high lock-in

  • When lock-in is expensive, users are more careful about their purchase
  • Users want to be compatible
    • With other users
    • With other vendors
  • Therefore users generally want to buy from the dominant vendor, or one they expect to become dominant

Major Problem: Wrong incentives

  • Those making important decisions affecting security are not the ones who suffer when security is breached
    • Structures are often designed to shift the blame. Example:
      • CIO vs. CISO: if the CISO takes the blame, the CIO may not focus on security
  • Decisions affected:
    • How much to spend on security
    • Integrating security into operations
    • Testing for security weaknesses (red teaming)

Market Failure: Information Asymmetry

  • Some principals know more than others, and that knowledge tips the market
  • Example
    • 100 used cars, half are good, half are lemons
    • If the seller knows which cars are lemons, they can control the price – they can let lemons dominate the lot
  • Another example
    • If a $50 security product is seen as no different from a $100 one, users will buy the $50 product
    • If the market price is $50, higher-quality products become money losers and disappear
    • This adverse selection is why there are a lot of poor security products
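
The lemons dynamic above can be sketched in a few lines: buyers who cannot distinguish quality offer the average value, which drives the best sellers out of the market (the values are illustrative):

```python
def market_price(values):
    """Buyers can't tell quality apart, so they offer the average value."""
    return sum(values) / len(values)

def sellers_remaining(values, price):
    """Sellers whose goods are worth more than the offer withdraw them."""
    return [v for v in values if v <= price]

# Four cars, two good (worth 100) and two lemons (worth 50): buyers offer
# the average (75), the good cars are withdrawn, and only lemons remain --
# mirroring the $50 vs. $100 security product example.
```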

Signaling

  • One party sends a signal to the other
    • Can mitigate information asymmetry
  • Example
    • In the used-car example, information about the seller, the car’s maintenance records, or third-party assurances can reduce uncertainty. That also helps the seller of a higher-quality car get a better price
  • College Degree is a form of Signalling
    • Employers want highly capable people
    • A degree from a credible institution signals some of that

Open Source

  • More transparency
  • Weak excludability – harder to get paid
    • Or pay for support, extra features, etc.
    • Example: Java language and how Sun/Oracle makes money off of it via their support services
  • Non-monetary rewards – kudos, status, reputation (see Signaling above: a way to signal one’s high level of competence)

Moral Hazard

  • X makes decision about how much risk to take; Y bears cost of things going wrong
    • Example: Y = insurance company
      • X may behave recklessly because Y bears cost
      • Y cannot observe reckless behavior of X
    • Example: Banking
      • FDIC coverage of $250K per depositor means the government takes on risk for the bank
      • E.g., the 2008 recession
    • Example: SolarWinds
      • They didn’t pay attention to security and focused on profits
      • This made them vulnerable to attack
      • The customer ends up paying for it
    • Example: Software – not always clear who is accountable
      • Faulty specs? Implementation? Testing? User?
      • Failures take time to show, though rewards are immediate
        • Investors fund a company that will eventually fail
        • Executives keep the salaries/bonuses of years past

Misaligned Incentives

  • Example: Alice has a budget to buy some equipment E. She is rewarded if she spends less than budget. Her bonus is proportional to the unspent amount 
    • This may encourage buying a lower-quality product E
    • It rewards the wrong thing

Weakest Link vs Sum of Efforts

  • Example of weakest link: program correctness
    • If one careless programmer can introduce a vulnerability, risk increases with the number of programmers
  • Example of sum of efforts: software testing
    • With more effort on testing, risk decreases with the number of testers
  • Best answer: hire fewer, better programmers and many testers
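
The contrast can be made concrete with a toy probability model (the per-person probabilities below are assumed, and programmers and testers are treated as independent):

```python
def weakest_link_risk(p_flaw: float, n_programmers: int) -> float:
    """P(at least one programmer introduces a flaw): grows with headcount."""
    return 1 - (1 - p_flaw) ** n_programmers

def residual_risk(base_risk: float, p_escape: float, n_testers: int) -> float:
    """Risk left after n independent testers, each missing a flaw with
    probability p_escape: shrinks as testers are added."""
    return base_risk * p_escape ** n_testers

# Ten programmers at a 10% flaw rate yield ~65% risk; three yield ~27%.
# Adding five testers who each catch half the flaws cuts that 27% to ~0.8%.
```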

Confusopoly

  • Confusion + oligopoly
  • Group of sellers with nearly identical product/service want to avoid competition so they provide many versions with different prices
    • Example: sauces
    • Example: beer, wine
    • Generic brand vs known brand
  • More Examples:
    • Mobile plans: minutes, text, bandwidth, data, family, roaming, etc
  • All this helps the seller but confuses the customer

Challenge of justifying costs of security

  • Security costs are up-front, benefits are long term
  • Short term often wins
    • NIMTO = Not in my term of office
    • Rewards are often short term
  • Longer-term security deficiencies can come back and be harder to overcome
  • Costs of security are concentrated on a few, its benefits are diffused among many
    • When costs of X are concentrated on few and benefits diffused to many, X tends not to happen
    • When costs of X are diffused to many and benefits concentrated to few, X tends to happen
  • Success of security is hidden
    • Enjoyed by many who did not spend on security
  • Example: Y2K
    • A lot was invested in preparing
    • The event had little impact
    • Afterwards, people questioned all that investment because nothing happened – missing the connection that nothing happened because of the investment
    • Metrics are needed to address this

 

Insurance

Dealing with Risk

  • Risk Avoidance
  • Mitigation or Prevention
  • Risk Transfer (insurance)
  • Risk Retention (accept it)

Insurance

  • Insurance is sharing risk and spreading it across many parties
  • Insurance is form of stochastic capitalization of risk across time and space
    • Using probability
  • Basic Steps
    • Collect evidence
    • Apply actuarial methods and predictors
    • Price expected loss over time
    • Charge premiums
    • Work to reduce exposure and loss, proactively and reactively
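
In the simplest case, the pricing steps above reduce to expected annual loss plus a loading. A sketch (the probability, severity, and 30% loading are assumed illustrative figures, not actuarial guidance):

```python
def pure_premium(annual_loss_prob: float, expected_severity: float) -> float:
    """Actuarial 'pure premium': expected annual loss = frequency * severity."""
    return annual_loss_prob * expected_severity

def gross_premium(pure: float, loading: float = 0.3) -> float:
    """Add a loading for expenses, capital, and profit on top of expected loss."""
    return pure * (1 + loading)

# A 2% annual chance of a $500K breach prices at a $10K pure premium,
# or roughly $13K with a 30% loading.
```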

Underwriting

  • The underwriter is the entity that takes on the risk
  • Underwriters pledge resources against claims based on statistical models
  • Underwriters make a profit when claims are not made
    • Interest on investments also contributes to profits
  • Some risks are catastrophic and can liquidate underwriters
    • Governments may backstop some losses
    • Example: after 9/11, the U.S. government helped cover the losses on the buildings

Risk Assessment

  • Underwriters are often unwilling to insure events with:
    • Unknown probability of occurrence
    • Unknown scope of damage
    • Unknown duration
    • Unknown provenance
    • Little historical consistency
  • Some damages and causes are explicitly disclaimed for this reason (e.g., acts of war)

Incentivization for the Insured

  • Insurers compete for business
  • Premiums can only be lowered so far before underwriters get nervous
  • Competition continues when 
    • The insured can be incentivized to reduce losses
    • Subclasses with lower risk can be identified for reduced premiums
    • Exclusions/limits palatable to customers can be included 

Why cyber-insurance

  • Traditional insurance policies exclude cyber-risks or do not mention them
  • Cyber-insurance is needed for:
    • Coverage of losses incurred by the insured
      • Data loss (both accidental and malicious), damage from cyber-attacks (break-ins, denials of service, etc.)
    • Coverage of liability for losses incurred by others
      • Such as failure to protect others’ data, or reputational damage to others (including defamation)

The present in cyber security insurance

  • ~50 insurance companies offer cyber-insurance (but most contracts are underwritten by about 5)
    • Many more are reluctant to enter the market
  • Tens of different types of cyber-insurance
    • About a dozen new types are created each year
  • Growth boosted by current/planned U.S. & E.U. regulations, with provisions for huge fines
  • 2016 saw 50% market growth and 80% claims growth
  • Ransomware accounts for over 40% of claims
    • Was only 5% a few years ago
    • Extortion amounts are ~ $50K to $2M
  • Lack of standardization of insurance contracts makes it difficult to compare competing ones
  • Buying cyber-insurance means having to undergo a series of invasive security evaluations
    • This may be putting a damper on demand
    • The cost of premiums plus required improvements may not be economical

Many companies had insurance, but insurers claimed the incident was “an act of war” and refused to pay damages; acts of war are a standard disclaimer.

 
