
Exploring the science and magic of Identity and Access Management

No other success in life can compensate for failure in the home. — David O. McKay

Sunday, July 5, 2015

Bots Generate a Majority of Internet Traffic

Information Security
Author: Mark Dixon
Friday, May 22, 2015
11:16 am


According to the 2015 Bad Bot Landscape report, published by Distil Networks, only 40% of Internet traffic is generated by humans! Good bots (e.g. Googlebot and Bingbot for search engines) account for 36% of traffic, while bad bots account for 23%.

Bad bots continue to place a huge tax on IT security and web infrastructure teams across the globe. The variety, volume and sophistication of today’s bots wreak havoc across online operations big and small. They’re the key culprits behind web scraping, brute force attacks, competitive data mining, brownouts, account hijacking, unauthorized vulnerability scans, spam, man-in-the-middle attacks, and click fraud.

These are just averages. It’s much worse for some big players.

Bad bots made up 78% of Amazon’s 2014 traffic, not a huge difference from 2013. Verizon Business really cleaned up its act, cutting its bad bot traffic by 54% in 2014.

It was surprising to me that the US is the largest source for bad bot traffic.

The United States, with thousands of cheap hosts, dominates the rankings in bad bot origination. Taken in isolation, absolute bad bot volume data can be somewhat misleading. Measuring bad bots per online user yields a country’s “Bad Bot GDP.”

Using this latter “bad bots per online user” statistic, the nations of Singapore, Israel, Slovenia and Maldives are the biggest culprits.

The report contains more great information for those who are interested in bots. Enjoy!

 

Turing Test (Reversed)

Information Security
Author: Mark Dixon
Tuesday, May 19, 2015
3:13 pm

[First diagram: the classic Turing Test]

The classic Turing Test, according to Wikipedia, is:

a test of a machine’s ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human. Alan Turing proposed that a human evaluator would judge natural language conversations between a human and a machine that is designed to generate human-like responses. …

The test was introduced by Turing in his 1950 paper “Computing Machinery and Intelligence.” …

As illustrated in the first diagram:

The “standard interpretation” of the Turing Test, in which player C, the interrogator, is given the task of trying to determine which player – A or B – is a computer and which is a human. The interrogator is limited to using the responses to written questions to make the determination. …

In the years since 1950, the test has proven to be both highly influential and widely criticised, and it is an essential concept in the philosophy of artificial intelligence.

[Second diagram: the reversed Turing Test]

What if the roles were reversed, and a computer was tasked with determining which of the entities on the other side of the wall was a human and which was a computer?  Such is the challenge for software that needs to decide which requests made to an online commerce system are generated by humans typing on a browser, and which are illicit bots imitating humans.

By one year-old estimate, “more than 61 percent of all Web traffic is now generated by bots, a 21 percent increase over 2012.” Computers must automatically determine which requests come from people and which come from bots, as illustrated in the second diagram.

While this is not strictly a Turing test, it has some similar characteristics.  The computer below the line doesn’t know ahead of time what techniques the bots will use to imitate human interaction. These decisions need to be made in real time and be accurate enough to prevent illicit bots from penetrating the system. A number of companies offer products or services that accomplish this task.
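To make that real-time decision concrete, here is a minimal sketch of a request filter built on just two signals: a script-like User-Agent and an inhuman request rate. The signal names and thresholds are my own illustrative assumptions, not any vendor’s actual method; commercial products combine far more signals than this.

```python
import time
from collections import defaultdict, deque

REQUEST_LIMIT = 20      # max requests per client within the sliding window
WINDOW_SECONDS = 10.0   # sliding-window length in seconds

_history = defaultdict(deque)   # client_ip -> timestamps of recent requests

def looks_like_bot(client_ip, headers, now=None):
    """Return True if a request looks bot-generated, using two cheap signals:
    a missing or script-like User-Agent, and an inhuman request rate."""
    now = time.monotonic() if now is None else now
    ua = headers.get("User-Agent", "").lower()
    if not ua or "curl" in ua or "python-requests" in ua:
        return True
    window = _history[client_ip]
    window.append(now)
    # Drop timestamps that have aged out of the sliding window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    return len(window) > REQUEST_LIMIT

# A browser-like client at human speed passes; a headless script is flagged.
assert looks_like_bot("10.0.0.1", {"User-Agent": "Mozilla/5.0"}, now=0.0) is False
assert looks_like_bot("10.0.0.2", {"User-Agent": "curl/7.68.0"}, now=0.0) is True
```

Of course, a fixed threshold like this is easy for a determined bot to evade, which is exactly the point of the reversed-Turing framing: the defender cannot know in advance what imitation technique the bot will use.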

One might ask, “Does this process of successfully choosing between human and bot constitute artificial intelligence?”

At the current state of the art, I think not, but it is an area where enhanced computer intelligence could provide real value.

 

Security: Complexity and Simplicity

Information Security
Author: Mark Dixon
Monday, May 18, 2015
4:48 pm


It is quite well documented that Bruce Schneier stated that “Complexity is the worst enemy of security.”

As a consumer, I think this complexity is great. There are more choices, more options, more things I can do. As a security professional, I think it’s terrifying. Complexity is the worst enemy of security.  (Crypto-Gram newsletter, March 15, 2000)

Leonardo da Vinci is widely credited with the statement, “Simplicity is the ultimate sophistication,” although there is some doubt whether he actually said those words.

Both statements have strong implications for information security today.

In that March 2000 newsletter, Bruce Schneier suggested five reasons why security challenges rise as complexity increases:

  1. Security bugs.  All software has bugs. As complexity rises, the number of bugs goes up.
  2. Modularity of complex systems.  Complex systems are necessarily modular; security often fails where modules interact.
  3. Increased testing requirements. The number of errors and the difficulty of evaluation grow rapidly as complexity increases.
  4. Complex systems are difficult to understand. Understanding becomes more difficult as the number of components and system options increase.
  5. Security analysis is more difficult. Everything is more complicated – the specification, the design, the implementation, the use, etc.

In his February 2015 article, “Is Complexity the Downfall of IT Security?” Jeff Clarke suggested some other reasons:

  1. More people involved. As a security solution becomes more complex, you’ll need more people to implement and maintain it. 
  2. More countermeasures. Firewalls, intrusion-detection systems, malware detectors and on and on. How do all these elements work together to protect a network without impairing its performance? 
  3. More attacks. Even if you secure your system against every known avenue of attack, tomorrow some enterprising hacker will find a new exploit. 
  4. More automation. Removing people from the loop can solve some problems, but like a redundancy-management system in the context of reliability, doing so adds another layer of complexity.

And, of course, we need to consider the enormous scale of this complexity. Cisco has predicted that 50 billion devices will be connected to the Internet by 2020. Every interconnection in that huge web of devices represents an attack surface.
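That scale claim is easy to make concrete. If every pairwise interconnection is potential attack surface, n devices admit at most n(n-1)/2 links (a loose upper bound, since real networks are nowhere near a full mesh):

```python
def max_interconnections(n):
    """Upper bound on pairwise links among n devices, assuming a full mesh."""
    return n * (n - 1) // 2

devices = 50_000_000_000           # Cisco's prediction for 2020
links = max_interconnections(devices)
print(f"{links:.3e}")              # prints 1.250e+21 potential links
```

Even if only a vanishingly small fraction of those links ever exists, the attack surface still grows far faster than the device count itself.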

How in the world can we cope? Perhaps we need to apply Leonardo’s simplicity principle.

I think Bruce Schneier’s advice provides a framework for simplification:

  1. Resilience. If nonlinear, tightly coupled complex systems are more dangerous and insecure, then the solution is to move toward more linear and loosely coupled systems. This might mean simplifying procedures or reducing dependencies or adding ways for a subsystem to fail gracefully without taking the rest of the system down with it.  A good example of a loosely coupled system is the air traffic control system. It’s very complex, but individual failures don’t cause catastrophic failures elsewhere. Even when a malicious insider deliberately took out an air traffic control tower in Chicago, all the planes landed safely. Yes, there were traffic disruptions, but they were isolated in both time and space.
  2. Prevention, Detection and Response. Security is a combination of prevention, detection, and response. All three are required, and none of them are perfect. As long as we recognize that — and build our systems with that in mind — we’ll be OK. This is no different from security in any other realm. A motivated, funded, and skilled burglar will always be able to get into your house. A motivated, funded, and skilled murderer will always be able to kill you. These are realities that we’ve lived with for thousands of years, and they’re not going to change soon. What is changing in IT security is response. We’re all going to have to get better about IT incident response because there will always be successful intrusions.

But a final thought from Bruce is very appropriate. “In security, the devil is in the details, and those details matter a lot.”

 

KuppingerDole: 8 Fundamentals for Digital Risk Mitigation

Identity, Information Security
Author: Mark Dixon
Tuesday, May 5, 2015
1:45 pm


Martin Kuppinger, founder and Principal Analyst at KuppingerCole, recently spoke in his keynote presentation at the European Identity & Cloud Conference about how IT has to transform and how Information Security can become a business enabler for the Digital Transformation of Business.

He presented eight “Fundamentals for Digital Risk Mitigation”:

  1. Digital Transformation affects every organization 
  2. Digital Transformation is here to stay
  3. Digital Transformation is more than just Internet of Things (IoT) 
  4. Digital Transformation mandates Organizational Change
  5. Everything & Everyone becomes connected 
  6. Security and Safety is not a dichotomy 
  7. Security is a risk and an opportunity 
  8. Identity is the glue and access control is what companies need

I particularly like his statements about security being both risk and opportunity and that “Identity is the glue” that holds things together.

Wish I could have been there to hear it in person.

 

Verizon 2015 Data Breach Investigations Report

Information Security
Author: Mark Dixon
Wednesday, April 15, 2015
8:25 pm


The new Verizon 2015 Data Breach Investigations Report has been published.

It is interesting to note … 

The year 2014 saw the term “data breach” become part of the broader public vernacular, with The New York Times devoting more than 700 articles related to data breaches, versus fewer than 125 the previous year.

And there are undoubtedly more to come. Consider one of the scariest charts in the report:

[The chart] contrasts how often attackers are able to compromise a victim in days or less (orange line) with how often defenders detect compromises within that same time frame (teal line). Unfortunately, the proportion of breaches discovered within days still falls well below that of time to compromise. Even worse, the two lines are diverging over the last decade, indicating a growing “detection deficit” between attackers and defenders.

[Chart: the growing “detection deficit” between attackers and defenders]

Enjoy the read! We in the information security industry have a lot of work to do.

 

Earl Perkins: The Identity of Things for the Internet of Things

Identity, Information Security, Internet of Things
Author: Mark Dixon
Wednesday, December 3, 2014
11:20 am


Yesterday, at the Gartner Identity and Access Management Summit, Earl Perkins, Gartner’s Research Vice President in Systems, Security and Risk, gave a thought-provoking talk, proposing that Identity and Access Management as it is today is not going to cut it for the Internet of Things. Some of the highlights include (filtered through the lens of my interpretation):

  • IoT can be described as a set of devices that can sense and interact with the world around them. Such devices can sense, analyze, act and communicate.
  • Devices, services and applications are creators or consumers of information, and must join humans in having identities.
  • Architectural concepts of IAM may still hold, but the scale will be vastly larger and must accommodate more than human identities.
  • Perhaps the word “thing” should be replaced by the term “entity”
  • Every entity has an identity
  • We need a model of entities and relationships between these entities.
  • We must address layered hierarchies of identities.
  • We should not separate device management and identity management systems.
  • Identity Management and Asset Management systems will likely converge.
  • Identity and Access Management may become:
    • Entity Relationship Management
    • Entity Access Management
  • We may think of architectures in four levels: things, gateways/controllers, connectivity, applications and analytics.
  • Two major camps of consumption: Enterprise (where more money is currently being spent) and Consumer (which is hot and sexy, but not currently making much money).
  • Strong year-over-year IoT growth is happening in four industry sectors:
    • Automotive – 67% CAGR
    • Consumer – 32% CAGR
    • Vertical specific – 24% CAGR
    • Generic business – 44% CAGR
  • Companies are “throwing jello against the wall” to see what sticks.
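As a thought experiment, the entity/relationship framing above can be sketched as a tiny data model. The class, field, and relation names below are my own illustrative assumptions, not Gartner’s:

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    """Anything with an identity: a person, device, service, or application."""
    identity: str                 # unique identifier for the entity
    kind: str                     # e.g. "human", "device", "service"
    relationships: dict = field(default_factory=dict)  # relation name -> [Entity]

    def relate(self, relation, other):
        """Record a directed, named relationship to another entity."""
        self.relationships.setdefault(relation, []).append(other)

# A human owns a thermostat, which reports to an analytics service.
alice = Entity("alice@example.com", "human")
thermostat = Entity("device:thermo-42", "device")
telemetry = Entity("svc:telemetry", "service")
alice.relate("owns", thermostat)
thermostat.relate("reports_to", telemetry)

# Access decisions can then traverse named relationships between entities,
# rather than consulting a flat list of human users.
assert thermostat in alice.relationships["owns"]
```

The point of the sketch is that humans, devices, and services all share one identity abstraction, and authorization becomes a question about the relationship graph.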

I really like Earl’s ideas about convergence of “entities” and “relationships” between entities. Please note my blog post Identity Relationship Diagrams, posted in March 2013.

I also favor his view that identity management should not be separate from device management.

It will be interesting to see how architectures are transformed and what “jello sticks to the wall” in the coming years.

 

McAfee Labs Threats Report – Fourth Quarter 2013

Information Security
Author: Mark Dixon
Friday, March 21, 2014
8:05 am

This morning, I read the recently released McAfee Labs Threats Report – Fourth Quarter 2013. The lead story was entitled “The cybercrime industry and its role in POS attacks.” To introduce a timeline chart that includes successful attacks on well-known retailers, the report states:

In December, we began to hear of a series of point-of-sale (POS) attacks on multiple retail chains across the United States. The first story to break was specific to Target; this attack has been ranked among the largest data-loss incidents of all time. Soon we learned of more retail chains affected by POS attacks. Neiman Marcus, White Lodging, Harbor Freight Tools, Easton-Bell Sports, Michaels Stores, and ‘wichcraft all suffered similar POS breaches in 2013. Although there has been no public acknowledgment that the attacks are related or carried out by the same actor, many of them leveraged off-the-shelf malware to execute the attacks.


Two themes in the article particularly stood out:

  • Many attacks leveraged “off-the-shelf malware”
  • The attacks were executed by a “healthy and growing cybercrime industry”

The article concluded:

We believe these breaches will have long-lasting repercussions. We expect to see changes to security approaches and compliance mandates and, of course, lawsuits. But the big lesson is that we face a healthy and growing cybercrime industry which played a key role in enabling and monetizing the results of these attacks.

Intruders are better prepared, more organized and better equipped than ever.  It’s a crazy world out there.  

 

KuppingerCole: Information Security Predictions and Recommendations 2014

Cloud Computing, Identity, Information Security, Internet of Things
Author: Mark Dixon
Thursday, December 19, 2013
2:53 pm


Kuppinger Cole just released an insightful Advisory Note: “Information Security Predictions and Recommendations 2014.”  The introduction stated:

Information Security is in constant flux. With the changing threat landscape, as well as a steady stream of new innovations, demand for Information Security solutions is both growing and re-focusing.

I like both the predictions and recommendations in this report.  Here are a few excerpts from my favorite recommendations:

Cloud IAM (Identity and Access Management)

Define an IAM strategy for dealing with all types of users, devices, and deployment models that integrates new Cloud IAM solutions and existing on-premise IAM seamlessly.

API Economy

Before entering this brave, new world of the API “Economy”, define your security concept first and invest in API Security solutions. Security can’t be an afterthought in this critical area.

IoEE (Internet of Everything and Everyone)

Before starting with IoEE, start with IoEE security. IoEE requires new security concepts, beyond traditional and limited approaches.

Ubiquitous Encryption

Encryption only helps when it is done consistently, without leaving severe gaps.

The whole paper is well worth reading.  Hopefully, this post whetted your appetite a little bit.

 

$1,000 per Record?

Information Security, Privacy
Author: Mark Dixon
Tuesday, November 19, 2013
5:49 pm


Today, I read of three separate instances where class-action lawsuits have been filed on behalf of people whose personal information had been breached at a healthcare company. The largest lawsuit, filed against TRICARE, represents 4.9 million affected individuals and is seeking damages of $1,000 per record – a total of $4.9 BILLION. Wow!

Neither this action nor other similar lawsuits have yet reached court or settlement. Depending on the outcomes, the potential costs of litigation and resulting awards to victims may emerge as the single most powerful financial driver for implementing good information security in the healthcare industry.

 

Video: Ann Cavoukian – Privacy and Security by Design: An Enterprise Architecture Approach

Information Security, Privacy
Author: Mark Dixon
Wednesday, November 6, 2013
4:17 pm

The following video features Ann Cavoukian, Ph.D., Information and Privacy Commissioner, Ontario, Canada, discussing the paper I co-authored with her, “Privacy and Security by Design: An Enterprise Architecture Approach.”

 
Copyright © 2005-2013, Mark G. Dixon. All Rights Reserved.