Snowden Used Cheap Web Crawler


  • Snowden Used Cheap Web Crawler

    http://www.nytimes.com/2014/02/09/us...a.html?hp&_r=0

    Snowden Used Low-Cost Tool to Best N.S.A.

    By DAVID E. SANGER and ERIC SCHMITT, Feb. 8, 2014

    WASHINGTON — Intelligence officials investigating how Edward J. Snowden gained access to a huge trove of the country’s most highly classified documents say they have determined that he used inexpensive and widely available software to “scrape” the National Security Agency’s networks, and kept at it even after he was briefly challenged by agency officials.

    Using “web crawler” software designed to search, index and back up a website, Mr. Snowden “scraped data out of our systems” while he went about his day job, according to a senior intelligence official. “We do not believe this was an individual sitting at a machine and downloading this much material in sequence,” the official said. The process, he added, was “quite automated.”

    The findings are striking because the N.S.A.’s mission includes protecting the nation’s most sensitive military and intelligence computer systems from cyberattacks, especially the sophisticated attacks that emanate from Russia and China. Mr. Snowden’s “insider attack,” by contrast, was hardly sophisticated and should have been easily detected, investigators found.
    Officials say Mr. Snowden used “web crawler” software. Channel 4/Agence France-Presse — Getty Images

    Moreover, Mr. Snowden succeeded nearly three years after the WikiLeaks disclosures, in which military and State Department files, of far less sensitivity, were taken using similar techniques.

    Mr. Snowden had broad access to the N.S.A.’s complete files because he was working as a technology contractor for the agency in Hawaii, helping to manage the agency’s computer systems in an outpost that focuses on China and North Korea. A web crawler, also called a spider, automatically moves from website to website, following links embedded in each document, and can be programmed to copy everything in its path.

    Mr. Snowden appears to have set the parameters for the searches, including which subjects to look for and how deeply to follow links to documents and other data on the N.S.A.’s internal networks. Intelligence officials told a House hearing last week that he accessed roughly 1.7 million files.
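The mechanics the article describes, a spider that follows embedded links to a configurable depth and copies what it finds along the way, can be sketched in a few lines. The link graph, page names, and parameters below are hypothetical, used only for illustration; a real crawler would fetch each page over the network and parse out its links rather than read them from a dictionary.

```python
from collections import deque

def crawl(link_graph, start, max_depth):
    """Depth-limited breadth-first crawl over a link graph.

    link_graph maps each page to the pages it links to; a real
    crawler would fetch and parse each page instead. Returns every
    page reached within max_depth hops of start.
    """
    seen = {start}
    queue = deque([(start, 0)])
    collected = []
    while queue:
        page, depth = queue.popleft()
        collected.append(page)          # "copy everything in its path"
        if depth == max_depth:
            continue                    # honor the depth parameter
        for linked in link_graph.get(page, []):
            if linked not in seen:
                seen.add(linked)
                queue.append((linked, depth + 1))
    return collected

# Toy internal "wiki": wiki/a links to two pages, one of which links on.
graph = {
    "wiki/a": ["wiki/b", "wiki/c"],
    "wiki/b": ["wiki/d"],
    "wiki/d": ["wiki/e"],
}
print(crawl(graph, "wiki/a", max_depth=2))
# ['wiki/a', 'wiki/b', 'wiki/c', 'wiki/d']  -- wiki/e is 3 hops away
```

Setting `max_depth` is exactly the "how deeply to follow links" parameter the investigators describe; the search-subject filtering would be an additional predicate on which pages get collected.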

    Among the materials prominent in the Snowden files are the agency’s shared “wikis,” databases to which intelligence analysts, operatives and others contributed their knowledge. Some of that material indicates that Mr. Snowden “accessed” the documents. But experts say they may well have been downloaded not by him but by the program acting on his behalf.

    Agency officials insist that if Mr. Snowden had been working from N.S.A. headquarters at Fort Meade, Md., which was equipped with monitors designed to detect when a huge volume of data was being accessed and downloaded, he almost certainly would have been caught. But because he worked at an agency outpost that had not yet been upgraded with modern security measures, his copying of what the agency’s newly appointed No. 2 officer, Rick Ledgett, recently called “the keys to the kingdom” raised few alarms.

    “Some place had to be last” in getting the security upgrade, said one official familiar with Mr. Snowden’s activities. But he added that Mr. Snowden’s actions had been “challenged a few times.”

    In at least one instance when he was questioned, Mr. Snowden provided what were later described to investigators as legitimate-sounding explanations for his activities: As a systems administrator he was responsible for conducting routine network maintenance. That could include backing up the computer systems and moving information to local servers, investigators were told.

    But from his first days working as a contractor inside the N.S.A.’s aging underground Oahu facility for Dell, the computer maker, and then at a modern office building on the island for Booz Allen Hamilton, the technology consulting firm that sells and operates computer security services used by the government, Mr. Snowden learned something critical about the N.S.A.’s culture: While the organization built enormously high electronic barriers to keep out foreign invaders, it had rudimentary protections against insiders.

    “Once you are inside the assumption is that you are supposed to be there, like in most organizations,” said Richard Bejtlich, the chief security strategist for FireEye, a Silicon Valley computer security firm, and a senior fellow at the Brookings Institution. “But that doesn’t explain why they weren’t more vigilant about excessive activity in the system.”

    Investigators have yet to answer the question of whether Mr. Snowden happened into an ill-defended outpost of the N.S.A. or sought a job there because he knew it had yet to install the security upgrades that might have stopped him.

    “He was either very lucky or very strategic,” one intelligence official said. A new book, “The Snowden Files,” by Luke Harding, a correspondent for The Guardian in London, reports that Mr. Snowden sought his job at Booz Allen because “to get access to a final tranche of documents” he needed “greater security privileges than he enjoyed in his position at Dell.”

    Through his lawyer at the American Civil Liberties Union, Mr. Snowden did not specifically address the government’s theory of how he obtained the files, saying in a statement: “It’s ironic that officials are giving classified information to journalists in an effort to discredit me for giving classified information to journalists. The difference is that I did so to inform the public about the government’s actions, and they’re doing so to misinform the public about mine.”
    The headquarters of Booz Allen Hamilton, one of Edward J. Snowden’s former employers, in McLean, Va. He had broad access to National Security Agency files as a contractor in Hawaii. Michael Reynolds/European Pressphoto Agency

    The N.S.A. declined to comment on its investigation or the security changes it has made since the Snowden disclosures. Other intelligence officials familiar with the findings of the investigations underway — there are at least four — were granted anonymity to discuss the investigations.

    In interviews, officials declined to say which web crawler Mr. Snowden had used, or whether he had written some of the software himself. Officials said it functioned like Googlebot, a widely used web crawler that Google developed to find and index new pages on the web. What officials cannot explain is why the presence of such software in a highly classified system was not an obvious tip-off to unauthorized activity.

    When inserted with Mr. Snowden’s passwords, the web crawler became especially powerful. Investigators determined he probably had also made use of the passwords of some colleagues or supervisors.

    But he was also aided by a culture within the N.S.A., officials say, that “compartmented” relatively little information. As a result, a 29-year-old computer engineer, working from a World War II-era tunnel in Oahu and then from downtown Honolulu, had access to unencrypted files that dealt with information as varied as the bulk collection of domestic phone numbers and the intercepted communications of Chancellor Angela Merkel of Germany and dozens of other leaders.

    Officials say web crawlers are almost never used on the N.S.A.’s internal systems, making it all the more inexplicable that the one used by Mr. Snowden did not set off alarms as it copied intelligence and military documents stored in the N.S.A.’s systems and linked through the agency’s internal equivalent of Wikipedia.

    The answer, officials and outside experts say, is that no one was looking inside the system in Hawaii for hard-to-explain activity. “The N.S.A. had the solution to this problem in hand, but they simply didn’t push it out fast enough,” said James Lewis, a computer expert at the Center for Strategic and International Studies who has talked extensively with intelligence officials about how the Snowden experience could have been avoided.
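The kind of volume monitoring the article says was in place at Fort Meade but missing in Hawaii can be approximated very simply: count each user's downloads over a monitoring window and flag the outliers. This is a minimal sketch with invented user names, log schema, and threshold, not a description of any actual NSA system; real monitors would also weight document sensitivity, time of day, and the user's role.

```python
from collections import Counter

def flag_heavy_downloaders(access_log, threshold):
    """Return users whose download count in the window exceeds
    threshold. access_log is a list of (user, document) events.
    """
    counts = Counter(user for user, _doc in access_log)
    return sorted(user for user, n in counts.items() if n > threshold)

# Hypothetical window: one analyst reads a handful of files,
# an automated scraper pulls hundreds.
log = [("analyst1", f"doc{i}") for i in range(5)]
log += [("scraper", f"doc{i}") for i in range(300)]
print(flag_heavy_downloaders(log, threshold=100))  # ['scraper']
```

Even a check this crude would have surfaced the "hard-to-explain activity" described above, which is the experts' point: the gap was deployment, not technology.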

    Nonetheless, the government had warning that it was vulnerable to such attacks. Similar techniques were used by Chelsea Manning, then known as Pfc. Bradley Manning, who was convicted of turning documents and videos over to WikiLeaks in 2010.

    Evidence presented during Private Manning’s court-martial for her role as the source for large archives of military and diplomatic files given to WikiLeaks revealed that she had used a program called “wget” to download the batches of files. That program automates the retrieval of large numbers of files, but it is considered less powerful than the tool Mr. Snowden used.

    The program’s use prompted changes in how secret information is handled at the State Department, the Pentagon and the intelligence agencies, but recent assessments suggest that those changes may not have gone far enough. For example, arguments have broken out about whether the N.S.A.’s data should all be encrypted “at rest” — when it is stored in servers — to make it harder to search and steal. But that would also make it harder to retrieve for legitimate purposes.
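The at-rest encryption trade-off described in that debate can be made concrete with a toy cipher. The XOR routine below is a deliberately insecure stand-in for real encryption (never use it for actual protection); it is here only to show both halves of the argument: a keyword visible in the plaintext cannot be found in the stored ciphertext, but every legitimate retrieval or search now has to pay for a decryption step first.

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'encryption' -- an insecure stand-in for real at-rest
    encryption. Applying it twice with the same key restores the
    original bytes."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Toy key with every byte >= 0x80, so ciphertext bytes can never
# collide with ASCII plaintext in this demo.
key = b"\x9c\xd3\xa7\xe1"
plaintext = b"report: troop movements in region X"
stored = xor_cipher(plaintext, key)        # what sits on the server

# "Harder to search and steal": the keyword is invisible at rest...
assert b"troop" in plaintext
assert b"troop" not in stored

# ...but "harder to retrieve for legitimate purposes" too: every
# authorized search must decrypt before it can match anything.
recovered = xor_cipher(stored, key)
assert recovered == plaintext
print("round trip ok")
```

A production system would use an authenticated cipher via a vetted library, but the operational tension is the same: opaque storage defeats a scraper's keyword search and an analyst's in equal measure.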

    Investigators have found no evidence that Mr. Snowden’s searches were directed by a foreign power, despite suggestions to that effect by the chairman of the House Intelligence Committee, Representative Mike Rogers, Republican of Michigan, in recent television appearances and at a hearing last week.

    But that leaves open the question of how Mr. Snowden chose the search terms to obtain his trove of documents, and why, according to James R. Clapper Jr., the director of national intelligence, they yielded a disproportionately large number of documents detailing American military movements, preparations and abilities around the world.

    In his statement, Mr. Snowden denied any deliberate effort to gain access to any military information. “They rely on a baseless premise, which is that I was after military information,” Mr. Snowden said.

    The head of the Defense Intelligence Agency, Lt. Gen. Michael T. Flynn, told lawmakers last week that Mr. Snowden’s disclosures could tip off adversaries to American military tactics and operations, and force the Pentagon to spend vast sums to safeguard against that. But he admitted a great deal of uncertainty about what Mr. Snowden possessed.

    “Everything that he touched, we assume that he took,” said General Flynn, including details of how the military tracks terrorists, of enemies’ vulnerabilities and of American defenses against improvised explosive devices. He added, “We assume the worst case.”
    To be Truly ignorant, Man requires an Education - Plato

  • #2
    It seems Snowden's accomplice was simplicity itself.



    • #3
      Originally posted by Minskaya View Post
      It seems Snowden's accomplice was simplicity itself.
      So true. And now the government's adherence to principle, that he must be punished, ensures that he will continue to possess and reveal more classified, possibly damaging material.

      At what point would the US consider it is in its best interest to make a deal with him?

      I don't think his revelations will have a lasting impact. Spy agencies will simply find better ways to secure their information.



      • #4
        Deal with Snowden?

        How is that even possible? They already look highly incompetent. Perception is everything.
        No such thing as a good tax - Churchill

        To make mistakes is human. To blame someone else for your mistake, is strategic.



        • #5
          Originally posted by Doktor View Post
          Deal with Snowden?

          How is that even possible? They already look highly incompetent. Perception is everything.
          Good point. I was not advocating amnesty, but speculating on whether perception may have to take second place to national security. That depends on what he has on his computer.



          • #6
            I can't see this happening. At least not publicly, after all the media hoopla.



            • #7
              I don't think he is stupid enough to cut any deals… unless he is tired of living.
              Removing a single turd from the cesspool doesn't make any difference.



              • #8
                Life out of bars, with non-prison meals and private showers?
                All those who are merciful with the cruel will come to be cruel to the merciful.
                -Talmud Kohelet Rabbah, 7:16.



                • #9
                  Originally posted by Minskaya View Post
                  It seems Snowden's accomplice was simplicity itself.
                  Minnie, I am not so sure about the simplicity thing. Could be.

                  How did the ant evade the Admins there at NSA?

                  Every major website tracks web crawlers. How is it that an organisation as big and as well funded as the NSA did not? Either someone was not doing his job properly, or they simply did not anticipate it. If the NSA did not anticipate it, that means they did not classify web crawlers as a threat. Of course they wouldn't: they don't put classified information online, which in turn means they weren't worried about someone reaching their servers from an outside location. Still, shouldn't a web crawler roaming around an internal network raise a flag? I am not sure; I am trying to understand it here.

                  Administrative privileges are limited depending on designation, hierarchy and the sensitivity of the work assignment. Snowden was a contract employee, so how could he install the application? And since he did, it means IT security protocols at the NSA can be flouted, all the more so with a web crawler that nobody probably considered a threat.

                  It might also be a case of the security engines not having enough signatures to raise a red flag for a potential attack or simple spying. For example, commercial anti-virus companies keep updating their databases with virus signatures every time a new worm or virus is discovered, and that protection reaches us as updates when we connect to the internet. On the other hand, they drop the code that tackles, say, a virus discovered in 2000, on the notion that the virus is no longer effective or doing the rounds of the internet in 2014. (That is done after extensive research, though.) In short, the disease is extinct. This keeps the anti-virus code small so it runs on fewer system resources. More moolah! So the whole issue is dropping old signatures and introducing new ones.
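The signature-rotation idea in this post can be sketched as a pruning pass over a signature database. The schema, entries, and retention window below are hypothetical, purely to illustrate dropping signatures for threats no longer seen in the wild; real vendors track far richer metadata before retiring anything.

```python
from datetime import date

def prune_signatures(signatures, max_age_years, today=None):
    """Drop signatures last observed more than max_age_years ago,
    keeping the database small and scans fast. signatures maps
    threat name -> year last observed (hypothetical schema)."""
    today = today or date.today()
    cutoff = today.year - max_age_years
    return {name: year for name, year in signatures.items() if year >= cutoff}

db = {"ILOVEYOU": 2000, "Conficker": 2009, "CryptoLocker": 2013}
print(prune_signatures(db, max_age_years=10, today=date(2014, 2, 8)))
# {'Conficker': 2009, 'CryptoLocker': 2013}
```

The trade-off is exactly the one the post describes: a smaller signature set scans faster, at the risk of missing a retired threat that resurfaces.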
                  Politicians are elected to serve...far too many don't see it that way - Albany Rifles! || Loyalty to country always. Loyalty to government, when it deserves it - Mark Twain! || I am a far left millennial!

