
Exploring the science and magic of Identity and Access Management

May 1927 – Model T Production Ceases

Automotive
Author: Mark Dixon
Tuesday, May 26, 2015
11:16 am

On May 26, 1927, Henry Ford and his son Edsel drove the final Model T out of the Ford factory. Completion of this 15 millionth Model T Ford marked the famous automobile’s official last day of production.

[Image: Ford Model T]

 

The History.com article stated:

More than any other vehicle, the relatively affordable and efficient Model T was responsible for accelerating the automobile’s introduction into American society during the first quarter of the 20th century. Introduced in October 1908, the Model T—also known as the “Tin Lizzie”—weighed some 1,200 pounds, with a 20-horsepower, four-cylinder engine. It got about 13 to 21 miles per gallon of gasoline and could travel up to 45 mph. Initially selling for around $850 (around $20,000 in today’s dollars), the Model T would later sell for as little as $260 (around $6,000 today) for the basic no-extras model. …

No car in history had the impact—both actual and mythological—of the Model T: Authors like Ernest Hemingway, E.B. White and John Steinbeck featured the Tin Lizzie in their prose, while the great filmmaker Charlie Chaplin immortalized it in satire in his 1928 film “The Circus.”

I have never driven a Model T, but have always loved seeing those old cars in real life or in pictures, faithfully restored or heavily customized. Just for fun, here is a hot rod that originally was a Model T. My guess is that nothing but the “bucket” is original equipment, but who cares? Enjoy!

[Image: Model T hot rod]


To the Moon and Back: We Can Do Hard Things

Leadership, Space Travel
Author: Mark Dixon
Tuesday, May 26, 2015
10:15 am

On May 25, 1961, President John F. Kennedy announced his goal of putting a man on the moon by the end of the decade.

[Image: President Kennedy announcing the moon goal, 1961]

A brief excerpt of the speech:

I believe that this nation should commit itself to achieving the goal, before this decade is out, of landing a man on the moon and returning him safely to the Earth. No single space project in this period will be more impressive to mankind, or more important for the long-range exploration of space; and none will be so difficult or expensive to accomplish.

… in a very real sense, it will not be one man going to the moon–if we make this judgment affirmatively, it will be an entire nation. For all of us must work to put him there.

What a thrill it was to live through those years of incredible innovation, splendid courage and diligent work by so many people. As President Kennedy said, it was not just one man going to the moon; it was a nation united in effort to get those astronauts there and bring them back.

P.S.  I think the look on Lyndon Johnson’s face is priceless.  It is as if he were thinking, “What in the world has that guy been smoking? We’ll never do that!”


Healthy Eating – Really?

Health
Author: Mark Dixon
Tuesday, May 26, 2015
10:04 am

Incorporating all the current health buzzwords in your diet doesn’t necessarily mean you are eating healthy:

[Image: Marketoonist cartoon]

 

Tom Fishburne (aka Marketoonist) explains:

It’s a tricky time to be a food marketer. How consumers define what it means to be “healthy” is in flux. As a food marketing friend pointed out, consumers are increasingly prioritizing food purity over calorie count.

Chipotle is the poster brand for the current state of health positioning. They’re taking a leadership role in progressive stances like GMO-free and sustainable sourcing. And this obscures the fact that an average meal at Chipotle packs a whopping 1,070 calories, close to a full day’s worth of salt, and 75% of a day’s worth of saturated fat. A Chipotle burrito has more than double the calories, cholesterol, and grams of fat than a Taco Bell Supreme Beef Burrito.

It’s similar to soda makers that tout being “made with real cane sugar” or granola bars that are really glorified candy bars. There’s an aura of health that distracts from the actual nutritional picture. Researchers refer to this as a “health halo.”

Maybe the biscuits and gravy I ate for breakfast yesterday weren’t so bad after all!


Bots Generate a Majority of Internet Traffic

Information Security
Author: Mark Dixon
Friday, May 22, 2015
11:16 am

[Image: Bot illustration]

According to the 2015 Bad Bot Landscape report, published by Distil Networks, only 40% of Internet traffic is generated by humans! Good bots (e.g. Googlebot and Bingbot for search engines) account for 36% of traffic, while bad bots account for 23%.

Bad bots continue to place a huge tax on IT security and web infrastructure teams across the globe. The variety, volume and sophistication of today’s bots wreak havoc across online operations big and small. They’re the key culprits behind web scraping, brute force attacks, competitive data mining, brownouts, account hijacking, unauthorized vulnerability scans, spam, man-in-the-middle attacks, and click fraud.

These are just averages. It’s much worse for some big players.

Bad bots made up 78% of Amazon’s 2014 traffic, not a huge difference from 2013. Verizon Business really cleaned up its act, cutting its bad bot traffic by 54% in 2014.

It was surprising to me that the US is the largest source for bad bot traffic.

The United States, with thousands of cheap hosts, dominates the rankings in bad bot origination. Taken in isolation, absolute bad bot volume data can be somewhat misleading. Measuring bad bots per online user yields a country’s “Bad Bot GDP.”

Using this latter “bad bots per online user” statistic, the nations of Singapore, Israel, Slovenia and Maldives are the biggest culprits.
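The arithmetic behind that per-user ranking is worth making concrete. Here is a minimal sketch in Python; the country names and figures are invented for illustration, not taken from the report:

    # "Bad Bot GDP": bad-bot traffic normalized by a country's online population.
    # All figures below are invented for illustration.
    countries = {
        "Country A": (9_000_000, 3_000_000),        # (bad-bot requests, online users)
        "Country B": (50_000_000, 250_000_000),
    }

    for name, (bad_bots, users) in countries.items():
        print(f"{name}: {bad_bots / users:.2f} bad-bot requests per online user")

    # Country A scores 3.00; Country B scores 0.20. A ranks worse per user
    # even though B originates far more bad-bot traffic in absolute terms.

That normalization is how a small, host-dense country can top the list while the US dominates in absolute volume.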

The report contains more great information for those who are interested in bots. Enjoy!


Big Day for Lindbergh and Earhart!

Aircraft
Author: Mark Dixon
Thursday, May 21, 2015
4:43 pm

Today is the anniversary of two great events in aviation history.  On May 21, 1927, Charles Lindbergh landed in Paris, successfully completing the first solo, nonstop flight across the Atlantic Ocean.  Five years later, on May 21, 1932, Amelia Earhart became the first pilot to repeat the feat, landing her plane in Ireland after flying across the North Atlantic.

Congratulations to these brave pioneers of the air!

[Image: Charles Lindbergh and Amelia Earhart]

Both Lindbergh’s Spirit of St. Louis and Earhart’s Lockheed Vega airplanes are now housed in the Smithsonian’s National Air and Space Museum in Washington, DC.

[Image: Spirit of St. Louis]

[Image: Earhart’s Lockheed Vega at the Smithsonian]

 


Turing Test (Reversed)

Information Security
Author: Mark Dixon
Tuesday, May 19, 2015
3:13 pm

[Image: Diagram of the standard Turing Test]

The classic Turing Test, according to Wikipedia, is:

a test of a machine’s ability to exhibit intelligent behavior equivalent to, or indistinguishable from, that of a human. Alan Turing proposed that a human evaluator would judge natural language conversations between a human and a machine that is designed to generate human-like responses. …

The test was introduced by Turing in his 1950 paper “Computing Machinery and Intelligence.” …

As illustrated in the first diagram:

The “standard interpretation” of the Turing Test, in which player C, the interrogator, is given the task of trying to determine which player – A or B – is a computer and which is a human. The interrogator is limited to using the responses to written questions to make the determination. …

In the years since 1950, the test has proven to be both highly influential and widely criticised, and it is an essential concept in the philosophy of artificial intelligence.

[Image: Diagram of a reversed Turing Test, with a computer as interrogator]

What if the roles were reversed, and a computer was tasked with determining which of the entities on the other side of the wall was a human and which was a computer?  Such is the challenge for software that needs to decide which requests made to an online commerce system are generated by humans typing on a browser, and which are illicit bots imitating humans.

By one estimate, a year old at this writing, “more than 61 percent of all Web traffic is now generated by bots, a 21 percent increase over 2012.” Computers must automatically determine which requests come from people and which come from bots, as illustrated in the second diagram.

While this is not strictly a Turing test, it has some similar characteristics.  The computer below the line doesn’t know ahead of time what techniques the bots will use to imitate human interaction. These decisions need to be made in real time and be accurate enough to prevent illicit bots from penetrating the system. A number of companies offer products or services that accomplish this task.
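Those vendors do not publish their methods, but a toy version of the idea is easy to sketch: score each request against a few behavioral signals and flag the high scorers. This is my own illustration; the signals and thresholds are invented, and real products combine far richer telemetry:

    # Toy bot detector: add up simple behavioral signals and flag requests
    # whose score crosses a threshold. Signals and weights are invented.
    from dataclasses import dataclass

    @dataclass
    class Request:
        user_agent: str
        requests_last_minute: int   # request rate observed from this client
        ran_javascript: bool        # did the client execute a JS challenge?
        mouse_events_seen: bool     # any human-like input events reported?

    def bot_score(req: Request) -> int:
        score = 0
        if not req.user_agent or "bot" in req.user_agent.lower():
            score += 2              # missing or self-identified user agent
        if req.requests_last_minute > 60:
            score += 2              # faster than a human browses
        if not req.ran_javascript:
            score += 1              # many simple bots never run JS
        if not req.mouse_events_seen:
            score += 1              # no human input observed
        return score

    def is_probably_bot(req: Request, threshold: int = 3) -> bool:
        return bot_score(req) >= threshold

    print(is_probably_bot(Request("python-requests/2.7", 120, False, False)))       # True
    print(is_probably_bot(Request("Mozilla/5.0 (Windows NT 6.1)", 4, True, True)))  # False

Real detection systems layer on device fingerprinting, reputation data and machine learning, and must keep adapting, since the bots keep changing their disguise.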

One might ask, “Does this process of successfully choosing between human and bot constitute artificial intelligence?”

At the current state of the art, I think not, but it is an area where enhanced computer intelligence could provide real value.


Security: Complexity and Simplicity

Information Security
Author: Mark Dixon
Monday, May 18, 2015
4:48 pm

[Image: Leonardo da Vinci and Bruce Schneier]

It is quite well documented that Bruce Schneier stated that “Complexity is the worst enemy of security.”

As a consumer, I think this complexity is great. There are more choices, more options, more things I can do. As a security professional, I think it’s terrifying. Complexity is the worst enemy of security.  (Crypto-Gram newsletter, March 15, 2000)

Leonardo da Vinci is widely credited with the statement, “Simplicity is the ultimate sophistication,” although there is some doubt whether he actually said those words.

Both statements have strong implications for information security today.

In the March 2000 newsletter, Bruce Schneier suggested five reasons why security challenges rise as complexity increases:

  1. Security bugs.  All software has bugs. As complexity rises, the number of bugs goes up.
  2. Modularity of complex systems.  Complex systems are necessarily modular; security often fails where modules interact.
  3. Increased testing requirements. The number of errors and the difficulty of evaluation grow rapidly as complexity increases.
  4. Complex systems are difficult to understand. Understanding becomes more difficult as the number of components and system options increase.
  5. Security analysis is more difficult. Everything is more complicated – the specification, the design, the implementation, the use, etc.

In his February 2015 article, “Is Complexity the Downfall of IT Security,” Jeff Clarke suggested some other reasons:

  1. More people involved. As a security solution becomes more complex, you’ll need more people to implement and maintain it. 
  2. More countermeasures. Firewalls, intrusion-detection systems, malware detectors and on and on. How do all these elements work together to protect a network without impairing its performance? 
  3. More attacks. Even if you secure your system against every known avenue of attack, tomorrow some enterprising hacker will find a new exploit. 
  4. More automation. Removing people from the loop can solve some problems, but like a redundancy-management system in the context of reliability, doing so adds another layer of complexity.

And, of course, we need to consider the enormous scale of this complexity.  Cisco has predicted that 50 billion devices will be connected to the Internet by 2020.  Every interconnection in that huge web of devices represents an attack surface.

How in the world can we cope? Perhaps we need to apply Leonardo’s simplicity principle.

I think Bruce Schneier’s advice provides a framework for simplification (a small code sketch of the loose-coupling idea follows the list):

  1. Resilience. If nonlinear, tightly coupled complex systems are more dangerous and insecure, then the solution is to move toward more linear and loosely coupled systems. This might mean simplifying procedures or reducing dependencies or adding ways for a subsystem to fail gracefully without taking the rest of the system down with it.  A good example of a loosely coupled system is the air traffic control system. It’s very complex, but individual failures don’t cause catastrophic failures elsewhere. Even when a malicious insider deliberately took out an air traffic control tower in Chicago, all the planes landed safely. Yes, there were traffic disruptions, but they were isolated in both time and space.
  2. Prevention, Detection and Response. Security is a combination of prevention, detection, and response. All three are required, and none of them are perfect. As long as we recognize that — and build our systems with that in mind — we’ll be OK. This is no different from security in any other realm. A motivated, funded, and skilled burglar will always be able to get into your house. A motivated, funded, and skilled murderer will always be able to kill you. These are realities that we’ve lived with for thousands of years, and they’re not going to change soon. What is changing in IT security is response. We’re all going to have to get better about IT incident response because there will always be successful intrusions.
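To make the loose-coupling point concrete, here is a minimal sketch (my illustration, not Schneier’s) of a circuit breaker, a common pattern for letting a subsystem fail gracefully: after repeated failures the caller stops invoking the dependency and fails fast for a cooldown period, so the trouble stays isolated instead of cascading:

    # Minimal circuit breaker: isolate a failing dependency by failing fast
    # for a cooldown period instead of hammering it and hanging the caller.
    import time

    class CircuitBreaker:
        def __init__(self, max_failures: int = 3, reset_after: float = 30.0):
            self.max_failures = max_failures
            self.reset_after = reset_after
            self.failures = 0
            self.opened_at = None   # None means the circuit is closed

        def call(self, func, *args, **kwargs):
            if self.opened_at is not None:
                if time.monotonic() - self.opened_at < self.reset_after:
                    raise RuntimeError("circuit open: failing fast")
                self.opened_at = None       # cooldown over; try the dependency again
                self.failures = 0
            try:
                result = func(*args, **kwargs)
            except Exception:
                self.failures += 1
                if self.failures >= self.max_failures:
                    self.opened_at = time.monotonic()
                raise
            self.failures = 0
            return result

A caller that catches the fail-fast error can fall back to cached data or a reduced feature set, keeping the failure isolated in both time and space, much like the air traffic control example above.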

But a final thought from Bruce is very appropriate. “In security, the devil is in the details, and those details matter a lot.”


Just Another Day at the Office

Space Travel
Author: Mark Dixon
Friday, May 15, 2015
2:35 pm

Today’s featured photo from NASA shows the Space Station’s crew on an ordinary day of work.

[Image: Astronauts Scott Kelly and Terry Virts aboard the ISS]

The six-member Expedition 43 crew worked a variety of onboard maintenance tasks, ensuring crew safety and the upkeep of the International Space Station’s hardware. In this image, NASA astronauts Scott Kelly (left) and Terry Virts (right) work on a Carbon Dioxide Removal Assembly (CDRA) inside the station’s Japanese Experiment Module.

For just a day or two, it would be so fun to work in weightless conditions.  Not too probable at this stage of my life, however!

 


Deep Blue Defeated Garry Kasparov

Artificial Intelligence
Author: Mark Dixon
Monday, May 11, 2015
7:14 am

Eighteen years ago today, on May 11, 1997, an IBM supercomputer named Deep Blue defeated chess champion Garry Kasparov in a six-game chess match, the first defeat of a reigning world chess champion by a computer under tournament conditions.

[Image: Deep Blue and Garry Kasparov]

Did Deep Blue demonstrate real artificial intelligence? The opinions are mixed. I like the comments of Drew McDermott, Professor of Computer Science at Yale University:

So, what shall we say about Deep Blue? How about: It’s a “little bit” intelligent. It knows a tremendous amount about an incredibly narrow area. I have no doubt that Deep Blue’s computations differ in detail from a human grandmaster’s; but then, human grandmasters differ from each other in many ways. On the other hand, a log of Deep Blue’s computations is perfectly intelligible to chess masters; they speak the same language, as it were. That’s why the IBM team refused to give game logs to Kasparov during the match; it would be equivalent to bugging the hotel room where he discussed strategy with his seconds. Saying Deep Blue doesn’t really think about chess is like saying an airplane doesn’t really fly because it doesn’t flap its wings.

It will be fun to see what the future brings. In the meantime, I like this phrase, which I first saw on a cubicle of a customer in Tennessee: “Intelligence, even if artificial, is preferable to stupidity, no matter how genuine.”


Lockheed SR-71 Blackbird

Aircraft
Author: Mark Dixon
Sunday, May 10, 2015
10:24 am

The Lockheed SR-71 Blackbird has to be one of the coolest airplanes ever built. Fast, beautiful, mysterious … this plane is full of intrigue!

[Image: Lockheed SR-71 Blackbird]

The National Museum of the US Air Force states:

The SR-71, unofficially known as the “Blackbird,” is a long-range, advanced, strategic reconnaissance aircraft developed from the Lockheed A-12 and YF-12A aircraft. The first flight of an SR-71 took place on Dec. 22, 1964, and the first SR-71 to enter service was delivered to the 4200th (later 9th) Strategic Reconnaissance Wing at Beale Air Force Base, Calif., in January 1966. The U.S. Air Force retired its fleet of SR-71s on Jan. 26, 1990, because of a decreasing defense budget and high costs of operation. 

Throughout its nearly 24-year career, the SR-71 remained the world’s fastest and highest-flying operational aircraft. From 80,000 feet, it could survey 100,000 square miles of Earth’s surface per hour. On July 28, 1976, an SR-71 set two world records for its class — an absolute speed record of 2,193.167 mph and an absolute altitude record of 85,068.997 feet.

The closest I ever got to one of these beauties was at the Hill Aerospace Museum near Ogden, Utah. Quite a sight!

Copyright © 2005-2016, Mark G. Dixon. All Rights Reserved.
Powered by WordPress.