
Exploring the science and magic of Identity and Access Management

Love begins at home, and it is not how much we do … but how much love we put in that action. — Mother Teresa


Bots Generate a Majority of Internet Traffic

Information Security
Author: Mark Dixon
Friday, May 22, 2015
11:16 am

[Image: bot illustration]

According to the 2015 Bad Bot Landscape report, published by Distil Networks, only 40% of Internet traffic is generated by humans! Good bots (e.g. Googlebot and Bingbot for search engines) account for 36% of traffic, while bad bots account for 23%.

Bad bots continue to place a huge tax on IT security and web infrastructure teams across the globe. The variety, volume and sophistication of today’s bots wreak havoc across online operations big and small. They’re the key culprits behind web scraping, brute force attacks, competitive data mining, brownouts, account hijacking, unauthorized vulnerability scans, spam, man-in-the-middle attacks, and click fraud.

These are just averages. It’s much worse for some big players.

Bad bots made up 78% of Amazon’s 2014 traffic, not a huge difference from 2013. Verizon Business really cleaned up its act, cutting its bad bot traffic by 54% in 2014.

It was surprising to me that the US is the largest source for bad bot traffic.

The United States, with thousands of cheap hosts, dominates the rankings in bad bot origination. Taken in isolation, absolute bad bot volume data can be somewhat misleading. Measuring bad bots per online user yields a country’s “Bad Bot GDP.”

Using this latter “bad bots per online user” statistic, the nations of Singapore, Israel, Slovenia and Maldives are the biggest culprits.
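To make that per-user metric concrete, here is a minimal Python sketch of how a “bad bots per online user” figure could be computed. The numbers are made-up placeholders for illustration, not figures from the Distil report:

    # Sketch of a per-capita bad-bot metric ("Bad Bot GDP").
    # All figures below are hypothetical placeholders, not data from the report.
    country_stats = {
        # country: (bad_bot_requests_per_day, online_users)
        "Singapore": (50_000_000, 4_900_000),
        "United States": (900_000_000, 280_000_000),
    }

    for country, (bad_bot_requests, online_users) in country_stats.items():
        per_user = bad_bot_requests / online_users
        print(f"{country}: {per_user:.1f} bad-bot requests per online user")

Even with invented numbers, the point comes through: a small country can far outrank the US per online user while generating much less absolute bad bot traffic.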

The report contains more great information for those who are interested in bots. Enjoy!

 

Big Day for Lindbergh and Earhart!

Aircraft
Author: Mark Dixon
Thursday, May 21, 2015
4:43 pm

Today is the anniversary of two great events in aviation history.  On May 21, 1927, Charles Lindbergh landed in Paris, successfully completing the first solo, nonstop flight across the Atlantic Ocean.  Five years later, on May 21, 1932, Amelia Earhart became the first pilot to repeat the feat, landing her plane in Ireland after flying across the North Atlantic.

Congratulations to these brave pioneers of the air!

[Image: Charles Lindbergh and Amelia Earhart]

Both Lindbergh’s Spirit of St. Louis and Earhart’s Lockheed Vega airplanes are now housed in the Smithsonian’s National Air and Space Museum in Washington, DC.

[Image: Spirit of St. Louis]

[Image: Lockheed Vega 5B at the Smithsonian]

 

Turing Test (Reversed)

Information Security
Author: Mark Dixon
Tuesday, May 19, 2015
3:13 pm

[Image: the standard Turing Test, with interrogator C questioning players A and B]

The classic Turing Test, according to Wikipedia, is:

a test of a machine’s ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human. Alan Turing proposed that a human evaluator would judge natural language conversations between a human and a machine that is designed to generate human-like responses. …

The test was introduced by Turing in his 1950 paper “Computing Machinery and Intelligence.” …

As illustrated in the first diagram:

The “standard interpretation” of the Turing Test, in which player C, the interrogator, is given the task of trying to determine which player – A or B – is a computer and which is a human. The interrogator is limited to using the responses to written questions to make the determination. …

In the years since 1950, the test has proven to be both highly influential and widely criticised, and it is an essential concept in the philosophy of artificial intelligence.

[Image: the reversed Turing Test, with a computer as interrogator]

What if the roles were reversed, and a computer was tasked with determining which of the entities on the other side of the wall was a human and which was a computer?  Such is the challenge for software that needs to decide which requests made to an online commerce system are generated by humans typing on a browser, and which are illicit bots imitating humans.

By one estimate published a year ago, “more than 61 percent of all Web traffic is now generated by bots, a 21 percent increase over 2012.” Computers must automatically determine which requests come from people and which come from bots, as illustrated in the second diagram.

While this is not strictly a Turing test, it has some similar characteristics.  The computer below the line doesn’t know ahead of time what techniques the bots will use to imitate human interaction. These decisions need to be made in real time and be accurate enough to prevent illicit bots from penetrating the system. A number of companies offer products or services that accomplish this task.
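Vendors understandably keep their exact methods proprietary, but a toy sketch can suggest the flavor of that real-time decision. The user agent strings and thresholds below are invented for illustration; real products layer on many more signals, such as JavaScript challenges, device fingerprinting and behavioral analysis:

    # Toy sketch of real-time human-vs-bot classification.
    # Signals and thresholds here are invented; this is not how any
    # particular commercial product works.
    import time
    from collections import defaultdict

    SELF_DECLARED_BOTS = ("curl", "python-requests", "scrapy", "wget")
    recent_requests = defaultdict(list)  # client IP -> recent request timestamps

    def looks_like_bot(ip, user_agent, now=None):
        now = time.time() if now is None else now
        # Signal 1: the client openly identifies itself as automation.
        if any(name in user_agent.lower() for name in SELF_DECLARED_BOTS):
            return True
        # Signal 2: a sustained request rate no human on a browser would produce.
        window = [t for t in recent_requests[ip] if now - t < 10.0] + [now]
        recent_requests[ip] = window
        return len(window) > 20  # more than ~2 requests per second for 10 seconds

    print(looks_like_bot("203.0.113.7", "python-requests/2.7.0"))  # True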

One might ask, “Does this process of successfully choosing between human and bot constitute artificial intelligence?”

At the current state of the art, I think not, but it is an area where enhanced computer intelligence could provide real value.

 

Security: Complexity and Simplicity

Information Security
Author: Mark Dixon
Monday, May 18, 2015
4:48 pm

[Image: Leonardo da Vinci and Bruce Schneier]

It is quite well documented that Bruce Schneier stated that “Complexity is the worst enemy of security.”

As a consumer, I think this complexity is great. There are more choices, more options, more things I can do. As a security professional, I think it’s terrifying. Complexity is the worst enemy of security.  (Crypto-Gram newsletter, March 15, 2000)

Leonardo da Vinci is widely credited with the statement, “Simplicity is the ultimate sophistication,” although there is some doubt whether he actually said those words.

Both statements have strong implications for information security today.

In that same March 2000 newsletter, Bruce Schneier suggested five reasons why security challenges rise as complexity increases:

  1. Security bugs.  All software has bugs. As complexity rises, the number of bugs goes up.
  2. Modularity of complex systems.  Complex systems are necessarily modular; security often fails where modules interact.
  3. Increased testing requirements. The number of errors and the difficulty of evaluation grow rapidly as complexity increases.
  4. Complex systems are difficult to understand. Understanding becomes more difficult as the number of components and system options increase.
  5. Security analysis is more difficult. Everything is more complicated – the specification, the design, the implementation, the use, etc.

In his February 2015 article, “Is Complexity the Downfall of IT Security?” Jeff Clarke suggested some other reasons:

  1. More people involved. As a security solution becomes more complex, you’ll need more people to implement and maintain it. 
  2. More countermeasures. Firewalls, intrusion-detection systems, malware detectors and on and on. How do all these elements work together to protect a network without impairing its performance? 
  3. More attacks. Even if you secure your system against every known avenue of attack, tomorrow some enterprising hacker will find a new exploit. 
  4. More automation. Removing people from the loop can solve some problems, but like a redundancy-management system in the context of reliability, doing so adds another layer of complexity.

And, of course, we need to consider the enormous scale of this complexity.  Cisco has predicted that 50 billion devices will be connected to the Internet by 2020.  Every interconnection in that huge web of devices represents an attack surface.
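The arithmetic behind that last sentence is sobering. If, in the worst case, every device could reach every other device, the number of potential pairwise links is n(n-1)/2:

    # Back-of-the-envelope: potential pairwise links among n connected devices.
    # 50 billion is Cisco's projection; treating every pair as a possible
    # attack path is a worst-case simplification for illustration.
    n = 50_000_000_000
    potential_links = n * (n - 1) // 2
    print(f"{potential_links:.2e} potential device-to-device links")  # ~1.25e+21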

How in the world can we cope? Perhaps we need to apply Leonardo’s simplicity principle.

I think Bruce Schneier’s advice provides a framework for simplification:

  1. Resilience. If nonlinear, tightly coupled complex systems are more dangerous and insecure, then the solution is to move toward more linear and loosely coupled systems. This might mean simplifying procedures or reducing dependencies or adding ways for a subsystem to fail gracefully without taking the rest of the system down with it.  A good example of a loosely coupled system is the air traffic control system. It’s very complex, but individual failures don’t cause catastrophic failures elsewhere. Even when a malicious insider deliberately took out an air traffic control tower in Chicago, all the planes landed safely. Yes, there were traffic disruptions, but they were isolated in both time and space.
  2. Prevention, Detection and Response. Security is a combination of prevention, detection, and response. All three are required, and none of them are perfect. As long as we recognize that — and build our systems with that in mind — we’ll be OK. This is no different from security in any other realm. A motivated, funded, and skilled burglar will always be able to get into your house. A motivated, funded, and skilled murderer will always be able to kill you. These are realities that we’ve lived with for thousands of years, and they’re not going to change soon. What is changing in IT security is response. We’re all going to have to get better about IT incident response because there will always be successful intrusions.
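In software terms, the “fail gracefully without taking the rest of the system down” idea from the first point often shows up as a timeout and a fallback wrapped around each dependency. Here is a minimal Python sketch; the service name, timeout and fallback value are all invented for illustration:

    # Sketch of graceful degradation: if a dependency fails or hangs,
    # return a degraded answer instead of propagating the failure.
    # The service, timeout and fallback below are invented examples.
    import concurrent.futures

    def call_with_fallback(func, fallback, timeout_seconds=2.0):
        with concurrent.futures.ThreadPoolExecutor(max_workers=1) as pool:
            try:
                return pool.submit(func).result(timeout=timeout_seconds)
            except Exception:
                return fallback  # subsystem failed; degrade, don't crash

    def fetch_recommendations():
        raise ConnectionError("recommendation service unreachable")  # simulated outage

    print(call_with_fallback(fetch_recommendations, fallback=[]))  # prints []

The page still renders, just without its recommendations. Like the air traffic control example, the failure stays isolated in both time and space.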

But a final thought from Bruce is very appropriate. “In security, the devil is in the details, and those details matter a lot.”

 

Just Another Day at the Office

Space Travel
Author: Mark Dixon
Friday, May 15, 2015
2:35 pm

Today’s featured photo from NASA shows the Space Station’s crew during an ordinary day of work.

[Image: astronauts Scott Kelly and Terry Virts working aboard the International Space Station]

The six-member Expedition 43 crew worked a variety of onboard maintenance tasks, ensuring crew safety and the upkeep of the International Space Station’s hardware. In this image, NASA astronauts Scott Kelly (left) and Terry Virts (right) work on a Carbon Dioxide Removal Assembly (CDRA) inside the station’s Japanese Experiment Module.

For just a day or two, it would be so fun to work in weightless conditions.  Not too probable at this stage of my life, however!

 

Deep Blue Defeated Garry Kasparov

Artificial Intelligence
Author: Mark Dixon
Monday, May 11, 2015
7:14 am

Eighteen years ago today, on May 11, 1997, an IBM supercomputer named Deep Blue defeated chess champion Garry Kasparov in a six-game chess match, the first defeat of a reigning world chess champion by a computer under tournament conditions.

[Image: Deep Blue and Garry Kasparov]

Did Deep Blue demonstrate real artificial intelligence? Opinions are mixed. I like the comments of Drew McDermott, Professor of Computer Science at Yale University:

So, what shall we say about Deep Blue? How about: It’s a “little bit” intelligent. It knows a tremendous amount about an incredibly narrow area. I have no doubt that Deep Blue’s computations differ in detail from a human grandmaster’s; but then, human grandmasters differ from each other in many ways. On the other hand, a log of Deep Blue’s computations is perfectly intelligible to chess masters; they speak the same language, as it were. That’s why the IBM team refused to give game logs to Kasparov during the match; it would be equivalent to bugging the hotel room where he discussed strategy with his seconds. Saying Deep Blue doesn’t really think about chess is like saying an airplane doesn’t really fly because it doesn’t flap its wings.

It will be fun to see what the future brings. In the meantime, I like this phrase, which I first saw on a customer’s cubicle in Tennessee: “Intelligence, even if artificial, is preferable to stupidity, no matter how genuine.”

 

Lockheed SR-71 Blackbird

Aircraft
Author: Mark Dixon
Sunday, May 10, 2015
10:24 am

The Lockheed SR-71 Blackbird has to be one of the coolest airplanes ever built. Fast, beautiful, mysterious … this plane is full of intrigue!

[Image: Lockheed SR-71 Blackbird]

The National Museum of the US Air Force states:

The SR-71, unofficially known as the “Blackbird,” is a long-range, advanced, strategic reconnaissance aircraft developed from the Lockheed A-12 and YF-12A aircraft. The first flight of an SR-71 took place on Dec. 22, 1964, and the first SR-71 to enter service was delivered to the 4200th (later 9th) Strategic Reconnaissance Wing at Beale Air Force Base, Calif., in January 1966. The U.S. Air Force retired its fleet of SR-71s on Jan. 26, 1990, because of a decreasing defense budget and high costs of operation. 

Throughout its nearly 24-year career, the SR-71 remained the world’s fastest and highest-flying operational aircraft. From 80,000 feet, it could survey 100,000 square miles of Earth’s surface per hour. On July 28, 1976, an SR-71 set two world records for its class — an absolute speed record of 2,193.167 mph and an absolute altitude record of 85,068.997 feet.

The closest I ever got to one of these beauties was at the Hill Aerospace Museum near Ogden, Utah. Quite a sight!

 

Do We Need a Mobile Strategy?

Identity, Mobile
Author: Mark Dixon
Friday, May 8, 2015
11:44 am

It is quite amazing to me how many customers I visit who are really struggling with how to handle mobile devices, data and applications securely.  This week, the following cartoon came across my desk. The funny thing to me is that the cartoon was published in 2011.  Here it is 2015, and we still struggle!

[Image: Marketoonist cartoon on mobile strategy, 2011]

 

McDonnell XF-85 Goblin

Aircraft
Author: Mark Dixon
Friday, May 8, 2015
9:39 am

I have long been fascinated with airplanes of all kinds. This post is the first of a series of photos of wacky and wonderful aircraft.

We start first with one of the coolest airplanes I have ever seen, the McDonnell XF-85 Goblin. Only two were built, and I saw one of them in the Wright-Patterson Air Force Base museum back in the mid-1980s.

From the National Museum of the US Air Force site:

The McDonnell Aircraft Corp. developed the XF-85 Goblin “parasite” fighter to protect B-36 bombers flying beyond the range of conventional escort fighters. Planners envisioned a “parent” B-36 carrying the XF-85 in the bomb bay, and if enemy fighters attacked, the Goblin would have been lowered on a trapeze and released to combat the attackers. Once the enemy had been driven away, the Goblin would return to the B-36, hook onto the trapeze, fold its wings and be lifted back into the bomb bay. The Goblin had no landing gear, but it had a steel skid under the fuselage and small runners on the wingtips for emergency landings.

Pretty neat little airplane!

[Image: McDonnell XF-85 Goblin]

 

We Passed!

Technology
Author: Mark Dixon
Thursday, May 7, 2015
3:10 pm

In order to register for an interesting online service this afternoon, I had to perform an Internet speed test.  It was nice to know that we (my computer, my internet connection and I) passed quite handily!

A lot of water has passed beneath the proverbial bridge since the days of 300 baud acoustic coupler modems!
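For perspective, here is a quick back-of-the-envelope comparison. The 50 Mbps figure is just a placeholder for a typical modern connection, not my actual test result:

    # How long does a 5 MB file take at 300 baud vs. a modern connection?
    # 300 baud is treated as 300 bits/second; 50 Mbps is a placeholder speed.
    file_bits = 5 * 1_000_000 * 8  # a 5 MB file

    for label, bits_per_second in [("300 baud modem", 300),
                                   ("50 Mbps broadband", 50_000_000)]:
        seconds = file_bits / bits_per_second
        print(f"{label}: {seconds:,.1f} seconds")
    # 300 baud: ~133,333 seconds (about 37 hours); 50 Mbps: under a second.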

[Image: Internet speed test result]

Copyright © 2005-2013, Mark G. Dixon. All Rights Reserved.