
Minority Report’s Pre-Crime: It’s Almost Here. (Well, Not Quite)

Written by Gary North on May 9, 2012

The U.S. Army is making headway in identifying pre-crime behavior. Anyway, that’s what The Army Times thinks.

This was triggered by WikiLeaks’ release of documents collected by Bradley Manning. The Army wants to know what soldiers are doing on their computers: their keystrokes, downloads, and Web searches. It is looking to buy software to do this. Well, it will soon start shopping.

This software must identify abnormal behavior by any of the Army’s army of 900,000 computer users.

Gee, you mean this isn’t sold on Amazon?

The major general in charge of computers described how this might work.

So I’m on the South American desk, doing intelligence work and all of a sudden I start going around to China, let’s say. That might be an anomaly, it might be justified, but I would sure like to know that and let someone make a decision, almost at the speed of thought.

The key words are these: “make a decision.” Who might this be? One thing is sure: he will have to be highly skilled, highly trained, and highly busy. Then, there is the question of rank. Will sergeants monitor captains?

There will have to be a chain of command. This is the U.S. Army we’re talking about.

Software of the type Smith describes is at various stages of development in the public and private sectors. Such software could spy on virtually any activity on a desktop depending on its programming, to detect when a soldier searches outside of his or her job description, downloads massive amounts of data from a shared hard drive or moves the data onto a removable drive.

The key words are these: “various stages of development.” This means they are not finished, let alone beta-tested.

The program could respond by recording the activity, alerting an administrator, shutting down the user’s access, or by feeding the person “dummy data” to watch what they do next, said Charles Beard, a cybersecurity executive with the defense firm SAIC’s intelligence, surveillance and reconnaissance group.
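The response logic Beard describes amounts to a rule engine: match an activity against programmed triggers, then pick one of a fixed menu of responses. Here is a minimal sketch in Python; every rule name, threshold, and field is hypothetical, invented purely to illustrate the shape of such a system, not taken from any actual product.

```python
# Hypothetical sketch of the monitor-and-respond logic described above.
# All rule names, thresholds, and event fields are illustrative only.

RESPONSES = ("record", "alert_admin", "shut_down_access", "feed_dummy_data")

def classify_activity(event):
    """Map a user-activity event to a response, or None if benign."""
    if event.get("bytes_downloaded", 0) > 10_000_000_000:  # a "massive" download
        return "shut_down_access"
    if event.get("removable_drive") and event.get("bytes_copied", 0) > 0:
        return "alert_admin"
    if event.get("search_topic") not in event.get("job_topics", []):
        return "record"  # outside the job description: log it, don't block yet
    return None

# Example: an analyst on the South America desk suddenly searching about China
event = {"search_topic": "china", "job_topics": ["south_america"],
         "bytes_downloaded": 0, "removable_drive": False}
print(classify_activity(event))  # prints "record"
```

Even this toy version shows where the trouble starts: every threshold is a policy decision, and someone up the chain of command has to own each one.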

I suppose it could. There would be levels of screening, of course — rather like screening for viruses. There would therefore be shutdowns all day long. There will be paranoia at the keyboard. There will be frustration. There will be endless complaints demanding that the “possible treason” slider be set to “low.” The “high” setting will keep 900,000 people from getting any work done.

What’s exciting, Smith said, is the possibility of detecting problems as they happen, on what cybersecurity experts call “zero day,” as opposed to after the fact.

“We don’t want to be forensics experts. We want to catch it at the perimeter,” Smith said. “We want to catch this before it has a chance to be exploited.”

A forensic expert is someone who can assess evidence related to a past crime. He can determine its meaning in a context. The Army doesn’t want these sorts of people.

Then who does it want? Who can identify the crime before it takes place? Where does he get this training? It will have to be more rigorous than the training a forensics expert receives.

The Army is the beta-testing division. The plans are much broader.

The Army’s efforts dovetail with a broader federal government initiative. President Obama signed an executive order last October that established an Insider Threat Task Force to develop a governmentwide program to deter, detect and mitigate insider threats.

Among other responsibilities, it would create policies for safeguarding classified information and networks, and for auditing and monitoring users.

In January, the White House’s Office of Management and Budget issued a memo directing government agencies that deal with classified information to ensure they adhere to security rules enacted after the WikiLeaks debacle.

Beyond technical solutions, the document asks agencies to create their own “insider threat program” to monitor employees for “behavioral changes” suggesting they might leak sensitive information.

I believe this is called locking the gate after the horse is out of the barn.

Or maybe it’s called setting fire to the barn.

The Insider Threat task force is scheduled to complete its work in October. Wow! One year: October to October. From a government committee, this is fast output indeed.

Next step: bring in the shrinks!

Deanna Caputo, lead behavioral psychologist for Mitre Corp., said both technical solutions and monitoring of human behaviors are needed for a successful detection and prevention program.

“To think that we can tackle the problem simply by technical solutions is a mistake,” Caputo said.

A “culture of reporting” is essential, she said. “We need to up the ante and expect a little bit more from our people” to report abnormal behaviors among their co-workers. However, “there is a fine line with that [reporting]. People need to trust they are in a safe environment to do their job.”

This impressed me.

Raytheon’s SureView software captures any security breach or policy violation it’s programmed to find and can “replay the event like a DVR,” for a local administrator or others to view, according to the company’s website. The software’s trigger is programmable and can be set to any behavior considered suspicious or not.

I can imagine the settings: (1) pornography downloading, (2) online gambling, (3) general snooping, (4) fooling around, (5) hacking for fun, (6) blocked-promotion resentment, (7) getting even, (8) on the take, (9) mentally disturbed, (10) Maj. Hasan copycat, (11) Antiwar.com mole.

At SAIC, which is testing a behavior analytics system, Beard likened behavioral modeling to the Pre-Crime unit from the science fiction movie “Minority Report.” Instead of using psychics to stop crimes before they occur, the software would be programmed to detect behavior that has preceded malicious acts in the past.

In real life, researchers are examining the behavior of malicious insiders to see what actions they took before they acted out. That in turn would be used to teach the software what behavior to flag.

The researchers had better be more alert than the people who were in charge of Hasan.

Cybersecurity expert Michael Tanji, an Army veteran who has spent nearly 20 years in the U.S. intelligence community, said he sees potential drawbacks and unanswered policy questions. He asked how the Army would implement such technology without unintentionally stifling cross-disciplinary collaboration among soldiers.

Knowing they are being monitored, personnel might avoid enterprising or creative behavior for fear it would be flagged by monitoring software, he said.

You think?

Tanji also predicted the technology would come at a considerable financial cost, both to warehouse the data collected by the software and to pay the added staff needed to monitor the reports it generates.

You think?

Patrick Reidy, an FBI cybersecurity official, said such concerns were valid. Because software may report benign behavior as malicious and vice versa, he cautioned against using technical solutions alone to solve insider threats.

“After a major incident, and no offense to any vendors, but the charlatanism always goes up,” he said. “It’s absolutely amazing how many phone calls I get from people who say they have solved the WikiLeaks problem or solved this or that problem. Everybody’s got to eat, but it’s simply not true.”

Finding bad behavior amid the vast sea of keystrokes, downloads and Web browsing on military computers is no easy task, DARPA acknowledges.

You think?

The program, based in behavioral science, would have to distinguish anomalous behavior from normal behavior, and deceptive and malicious behavior from anomalous behavior, the solicitation reads.

This may take a few years.

A solicitation for another program — Anomaly Detection at Multiple Scales, or ADAMS — uses accused Fort Hood shooter Maj. Nidal Hasan to frame the problem. It asks how to sift for anomalies through millions of data points — the emails and text messages on Fort Hood, for instance — using a unique algorithm, to rank threats and learn based on user feedback.
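The ADAMS framing — sift millions of data points for anomalies and rank the threats — is, at its simplest, per-user baseline scoring. The sketch below is an invented illustration of that idea only, assuming made-up users and message counts; it has nothing to do with the actual ADAMS algorithms, which the solicitation leaves unspecified.

```python
# Illustrative sketch of anomaly ranking in the ADAMS spirit: score each
# user's activity today against that user's own history, then rank the
# largest deviations for a human analyst. All data here is invented.
from statistics import mean, stdev

def anomaly_score(history, today):
    """Z-score of today's activity count against the user's own baseline."""
    mu, sigma = mean(history), stdev(history)
    return 0.0 if sigma == 0 else abs(today - mu) / sigma

def rank_users(activity):
    """activity: {user: (history, today)} -> users, most anomalous first."""
    scores = {u: anomaly_score(h, t) for u, (h, t) in activity.items()}
    return sorted(scores, key=scores.get, reverse=True)

activity = {
    "analyst_a": ([12, 15, 11, 14], 90),  # sudden spike in daily messages
    "analyst_b": ([40, 42, 38, 41], 43),  # normal variation
}
print(rank_users(activity))  # prints ['analyst_a', 'analyst_b']
```

The catch, as the solicitation itself implies, is the next step: nothing in a deviation score distinguishes a mole from a soldier who simply got a new assignment. That judgment still lands on the analyst reading the ranked list.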

The software had better be better than the Army’s existing screening system. This was published in Time.

“We asked him pointedly, ‘Nidal, do you consider Shari’a law to transcend the Constitution of the United States?’ And he said, ‘Yes,’ ” a classmate told TIME on Monday. “We asked him if homicidal bombers were rewarded for their acts with 72 virgins in heaven and he responded, ‘I’ve done the research — yes.’ Those are comments he made in front of the class.” But such statements apparently didn’t trigger an inquiry.

Continue Reading on www.armytimes.com



3 thoughts on “Minority Report’s Pre-Crime: It’s Almost Here. (Well, Not Quite)”

  1. jheasler12175 says:

    I have a novel idea. When someone's duties are restricted to a particular subject or area, why not just restrict their computer access to everything else? Writing software for local controls might be much less expensive and less intellectually intrusive than monitoring keystrokes like an omnipresent policeman looking over the shoulder. The idea is to limit access to off-limits information, not to catch people wandering on the Internet, whether for reasons of boredom or worse.

  2. In 1952 I read the book "1984"

    I may yet live to see it!!!

  3. This sounds like an intrusion into people's lives and should not be done.