Shared posts

18 Oct 08:14

Air Gaps

by schneier

Since I started working with Snowden's documents, I have been using a number of tools to try to stay secure from the NSA. The advice I shared included using Tor, preferring certain cryptography over others, and using public-domain encryption wherever possible.

I also recommended using an air gap, which physically isolates a computer or local network of computers from the Internet. (The name comes from the literal gap of air between the computer and the Internet; the word predates wireless networks.)

But this is more complicated than it sounds, and requires explanation.

Since we know that computers connected to the Internet are vulnerable to outside hacking, an air gap should protect against those attacks. There are a lot of systems that use -- or should use -- air gaps: classified military networks, nuclear power plant controls, medical equipment, avionics, and so on.

Osama bin Laden used one. I hope human rights organizations in repressive countries are doing the same.

Air gaps might be conceptually simple, but they're hard to maintain in practice. The truth is that nobody wants a computer that never receives files from the Internet and never sends files out into the Internet. What they want is a computer that's not directly connected to the Internet, albeit with some secure way of moving files on and off.

But every time a file moves back and forth, there's the potential for attack.

And air gaps have been breached. Stuxnet was a US and Israeli military-grade piece of malware that attacked the Natanz nuclear facility in Iran. It successfully jumped the air gap and penetrated the Natanz network. Another piece of malware named agent.btz, probably Russian in origin, successfully jumped the air gap protecting US military networks.

These attacks work by exploiting security vulnerabilities in the removable media used to transfer files on and off the air-gapped computers.

Since working with Snowden's NSA files, I have tried to maintain a single air-gapped computer. It turned out to be harder than I expected, and I have ten rules for anyone trying to do the same:

1. When you set up your computer, connect it to the Internet as little as possible. It's impossible to completely avoid connecting the computer to the Internet, but try to configure it all at once and as anonymously as possible. I purchased my computer off-the-shelf in a big box store, then went to a friend's network and downloaded everything I needed in a single session. (The ultra-paranoid way to do this is to buy two identical computers, configure one using the above method, upload the results to a cloud-based anti-virus checker, and transfer the results of that to the air gap machine using a one-way process.)

2. Install the minimum software set you need to do your job, and disable all operating system services that you won't need. The less software you install, the less an attacker has available to exploit. I downloaded and installed OpenOffice, a PDF reader, a text editor, TrueCrypt, and BleachBit. That's all. (No, I don't have any inside knowledge about TrueCrypt, and there's a lot about it that makes me suspicious. But for Windows full-disk encryption it's that, Microsoft's BitLocker, or Symantec's PGPDisk -- and I am more worried about large US corporations being pressured by the NSA than I am about TrueCrypt.)

3. Once you have your computer configured, never directly connect it to the Internet again. Consider physically disabling the wireless capability, so it doesn't get turned on by accident.

4. If you need to install new software, download it anonymously from a random network, put it on some removable media, and then manually transfer it to the air-gapped computer. This is by no means perfect, but it's an attempt to make it harder for the attacker to target your computer.

5. Turn off all autorun features. This should be standard practice for all the computers you own, but it's especially important for an air-gapped computer. Agent.btz used autorun to infect US military computers. (A Windows sketch of this setting appears after this list.)

6. Minimize the amount of executable code you move onto the air-gapped computer. Text files are best. Microsoft Office files and PDFs are more dangerous, since they might have embedded macros. Turn off all macro capabilities you can on the air-gapped computer. Don't worry too much about patching your system; in general, the risk of the executable code is worse than the risk of not having your patches up to date. You're not on the Internet, after all.

7. Only use trusted media to move files on and off air-gapped computers. A USB stick you purchase from a store is safer than one given to you by someone you don't know -- or one you find in a parking lot.

8. For file transfer, a writable optical disk (CD or DVD) is safer than a USB stick. Malware can silently write data to a USB stick, but it can't spin the CD-R up to 1000 rpm without your noticing. This means that the malware can only write to the disk when you write to the disk. You can also verify how much data has been written to the CD by physically checking the back of it. If you've only written one file, but it looks like three-quarters of the CD was burned, you have a problem. Note: the first company to market a USB stick with a light that indicates a write operation -- not read or write; I've got one of those -- wins a prize.

9. When moving files on and off your air-gapped computer, use the absolute smallest storage device you can. And fill up the entire device with random files. If an air-gapped computer is compromised, the malware is going to try to sneak data off it using that media. While malware can easily hide stolen files from you, it can't break the laws of physics. So if you use a tiny transfer device, it can only steal a very small amount of data at a time. If you use a large device, it can steal that much more. Business-card-sized mini-CDs can have capacity as low as 30 MB. I still see 1-GB USB sticks for sale. (A sketch of the fill step appears after this list.)

10. Consider encrypting everything you move on and off the air-gapped computer. Sometimes you'll be moving public files and it won't matter, but sometimes you won't be, and it will. And if you're using optical media, those disks will be impossible to erase. Strong encryption solves these problems. And don't forget to encrypt the computer as well; whole-disk encryption is best. (A sketch of the file-encryption step appears after this list.)
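
For rule 5, on Windows the AutoRun behavior is governed by the NoDriveTypeAutoRun registry policy. The following is a minimal Python sketch, not the only way to do it: it disables AutoRun for every drive type, needs administrator rights, and the same effect can be had through Group Policy or a .reg file.

```python
# Sketch (assumption-laden): disable Windows AutoRun for every drive type by
# setting the NoDriveTypeAutoRun policy to 0xFF. Run as administrator;
# the change may need a log-off/log-on to take effect.
import winreg

POLICY_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, POLICY_KEY, 0,
                        winreg.KEY_SET_VALUE) as key:
    # 0xFF sets the bit for every drive type, i.e. AutoRun disabled everywhere.
    winreg.SetValueEx(key, "NoDriveTypeAutoRun", 0, winreg.REG_DWORD, 0xFF)
```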
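
For rule 9, here is a minimal sketch of the "fill the rest of the device" step, assuming a hypothetical mount point for the transfer stick: it writes files of random bytes until the volume reports it is full, leaving a compromised machine little free space to stash stolen data in.

```python
# Sketch: pad a transfer volume with files of random bytes until it is full,
# so compromised software has (almost) no free space left to hide stolen data.
import errno
import os

def fill_with_random(volume, chunk_size=1024 * 1024):
    index = 0
    try:
        while True:
            path = os.path.join(volume, f"filler_{index:04d}.bin")
            with open(path, "wb") as fh:
                fh.write(os.urandom(chunk_size))   # 1 MB of random bytes
            index += 1
    except OSError as exc:
        if exc.errno != errno.ENOSPC:              # anything but "disk full"
            raise
        print(f"Volume full after {index} filler files.")

fill_with_random("/media/transfer")                # hypothetical mount point
```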
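
For rule 10, here is a sketch of per-file encryption using the third-party Python cryptography package (Fernet, an authenticated symmetric scheme). It is not one of the tools named above, the file names are hypothetical, and the key has to stay on the air-gapped side, never on the transfer media.

```python
# Sketch: authenticated symmetric file encryption with the third-party
# "cryptography" package. Generate the key once and keep it only on the
# air-gapped machine, never on the transfer media.
from cryptography.fernet import Fernet

def encrypt_file(src, dst, key):
    with open(src, "rb") as fh:
        ciphertext = Fernet(key).encrypt(fh.read())
    with open(dst, "wb") as fh:
        fh.write(ciphertext)

def decrypt_file(src, dst, key):
    with open(src, "rb") as fh:
        plaintext = Fernet(key).decrypt(fh.read())
    with open(dst, "wb") as fh:
        fh.write(plaintext)

key = Fernet.generate_key()                        # do this once; store it safely
encrypt_file("notes.txt", "notes.txt.enc", key)    # hypothetical file names
```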

One thing I didn't do, although it's worth considering, is use a stateless operating system like Tails. You can configure Tails with a persistent volume to save your data, but no operating system changes are ever saved. Booting Tails from a read-only DVD -- you can keep your data on an encrypted USB stick -- is even more secure. Of course, this is not foolproof, but it greatly reduces the potential avenues for attack.

Yes, all this is advice for the paranoid. And it's probably impossible to enforce for any network more complicated than a single computer with a single user. But if you're thinking about setting up an air-gapped computer, you already believe that some very powerful attackers are after you personally. If you're going to use an air gap, use it properly.

Of course you can take things further. I have met people who have physically removed the camera, microphone, and wireless capability altogether. But that's too much paranoia for me right now.

This essay previously appeared on Wired.com.

EDITED TO ADD: Yes, I am ignoring TEMPEST attacks. I am also ignoring black bag attacks against my home.

11 Aug 08:01

Unity Workshops

by eriksvedang

I’m giving three one-day workshops on how to create interactive worlds in Unity, 16–18 August. They will be held in Gothenburg at NTI Gymnasiet. More information can be found on the Kulturkalaset website. The workshops are free to attend, so please tell anyone nearby who might be interested.

Take care,
Erik


Filed under: Teaching
27 Jun 07:17

WALL-O-TRON, the interactive rock climbing wall

by Brian Benchoff


Last April, hackerspaces around the country received a gift from Red Bull for their creation challenge. The hackerspace teams were charged with creating 'something with LEDs' and let loose in their workshops for a chance to win a trip to NYC and build some cool stuff. Of course, Red Bull couldn't bring all the teams to the Big Apple, and a few incredible projects were left by the wayside in their home hackerspaces.

One such project was the WALL-O-TRON from Team Rabbit-Hole, home base of the Tymkrs. It's a huge wall embedded with LEDs that turn an ordinary rock climbing wall into a game called WallSweeper – climb a path to the illuminated hold, but don't touch the 'hot rock' or your game is over.

The hand holds are illuminated by over 300 LEDs connected to a Linux PC. The sign above the wall is controlled by Red Bull's TurBULL Encabulator, and the giant 'WALL-O-TRON' letters are huge pieces of foam with five meters of RGB LEDs embedded inside.
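
The post doesn't include Team Rabbit-Hole's code, but the WallSweeper rule described above is simple enough to sketch. Everything below, including the hold ids, light_hold(), and read_touched_holds(), is hypothetical stand-in glue for whatever actually drives the LEDs and hold sensors, not the real implementation.

```python
# Hypothetical sketch of the WallSweeper rule: light a target hold, pick a
# "hot rock", and poll the holds -- reaching the target scores, touching the
# hot rock ends the game. The two helpers are stand-ins for whatever actually
# drives the LEDs and hold sensors on the real wall.
import random

HOLDS = list(range(300))                  # one id per instrumented hand hold

def light_hold(hold_id, color):
    print(f"hold {hold_id} -> {color}")   # stand-in for the LED driver

def read_touched_holds():
    return set()                          # stand-in for the hold touch sensors

def play_round():
    target, hot_rock = random.sample(HOLDS, 2)
    light_hold(target, "green")           # the hold the climber must reach
    while True:
        touched = read_touched_holds()
        if hot_rock in touched:
            return False                  # touched the hot rock: game over
        if target in touched:
            return True                   # made it: light the next target
```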

It's a great project with the potential to be upgraded with more games in the future, and it's perfect for the rock climbing playground it's situated in.


Filed under: led hacks
05 Jun 05:59

The Problems with CALEA-II

by schneier

The FBI wants a new law that will make it easier to wiretap the Internet. Although its claim is that the new law will only maintain the status quo, it's really much worse than that. This law will result in less-secure Internet products and create a foreign industry in more-secure alternatives. It will impose costly burdens on affected companies. It will assist totalitarian governments in spying on their own citizens. And it won't do much to hinder actual criminals and terrorists.

As the FBI sees it, the problem is that people are moving away from traditional communication systems like telephones onto computer systems like Skype. Eavesdropping on telephones used to be easy. The FBI would call the phone company, which would bring agents into a switching room and allow them to literally tap the wires with a pair of alligator clips and a tape recorder. In the 1990s, the government forced phone companies to provide an analogous capability on digital switches; but today, more and more communication happens over the Internet.

What the FBI wants is the ability to eavesdrop on everything. Depending on the system, this ranges from easy to impossible. E-mail systems like Gmail are easy. The mail resides in Google's servers, and the company has an office full of people who respond to requests for lawful access to individual accounts from governments all over the world. Encrypted voice systems like Silent Circle are impossible to eavesdrop on -- the calls are encrypted from one computer to the other, and there's no central node to eavesdrop from. In those cases, the only way to make the system eavesdroppable is to add a backdoor to the user software. This is precisely the FBI's proposal. Companies that refuse to comply would be fined $25,000 a day.

The FBI believes it can have it both ways: that it can open systems to its eavesdropping, but keep them secure from anyone else's eavesdropping. That's just not possible. It's impossible to build a communications system that allows the FBI surreptitious access but doesn't allow similar access by others. When it comes to security, we have two options: We can build our systems to be as secure as possible from eavesdropping, or we can deliberately weaken their security. We have to choose one or the other.

This is an old debate, and one we've been through many times. The NSA even has a name for it: the equities issue. In the 1980s, the equities debate was about export control of cryptography. The government deliberately weakened U.S. cryptography products because it didn't want foreign groups to have access to secure systems. Two things resulted: fewer Internet products with cryptography, to the insecurity of everybody, and a vibrant foreign security industry based on the unofficial slogan "Don't buy the U.S. stuff -- it's lousy."

In 1993, the debate was about the Clipper Chip. This was another deliberately weakened security product, an encrypted telephone. The FBI convinced AT&T to add a backdoor that allowed for surreptitious wiretapping. The product was a complete failure. Again, why would anyone buy a deliberately weakened security system?

In 1994, the Communications Assistance for Law Enforcement Act mandated that U.S. companies build eavesdropping capabilities into phone switches. These were sold internationally; some countries liked having the ability to spy on their citizens. Of course, so did criminals, and there were public scandals in Greece (2005) and Italy (2006) as a result.

In 2012, we learned that every phone switch sold to the Department of Defense had security vulnerabilities in its surveillance system. And just this May, we learned that Chinese hackers breached Google's system for providing surveillance data for the FBI.

The new FBI proposal will fail in all these ways and more. The bad guys will be able to get around the eavesdropping capability, either by building their own security systems -- not very difficult -- or buying the more-secure foreign products that will inevitably be made available. Most of the good guys, who don't understand the risks or the technology, will not know enough to bother and will be less secure. The eavesdropping functions will 1) result in more obscure -- and less secure -- product designs, and 2) be vulnerable to exploitation by criminals, spies, and everyone else. U.S. companies will be forced to compete at a disadvantage; smart customers won't buy the substandard stuff when there are more-secure foreign alternatives. Even worse, there are lots of foreign governments who want to use these sorts of systems to spy on their own citizens. Do we really want to be exporting surveillance technology to the likes of China, Syria, and Saudi Arabia?

The FBI's short-sighted agenda also works against the parts of the government that are still working to secure the Internet for everyone. Initiatives within the NSA, the DOD, and DHS to do everything from securing computer operating systems to enabling anonymous web browsing will all be harmed by this.

What to do, then? The FBI claims that the Internet is "going dark," and that it's simply trying to maintain the status quo of being able to eavesdrop. This characterization is disingenuous at best. We are entering a golden age of surveillance; there are more electronic communications available for eavesdropping than ever before, including whole new classes of information: location tracking, financial tracking, and vast databases of historical communications such as e-mails and text messages. The FBI's surveillance department has it better than ever. With regard to voice communications, yes, software phone calls will be harder to eavesdrop upon. (Although there are questions about Skype's security.) That's just part of the evolution of technology, and one that on balance is a positive thing.

Think of it this way: We don't hand the government copies of our house keys and safe combinations. If agents want access, they get a warrant and then pick the locks or bust open the doors, just as a criminal would do. A similar system would work on computers. The FBI, with its increasingly non-transparent procedures and systems, has failed to make the case that this isn't good enough.

Finally there's a general principle at work that's worth explicitly stating. All tools can be used by the good guys and the bad guys. Cars have enormous societal value, even though bank robbers can use them as getaway cars. Cash is no different. Both good guys and bad guys send e-mails, use Skype, and eat at all-night restaurants. But because society consists overwhelmingly of good guys, the good uses of these dual-use technologies greatly outweigh the bad uses. Strong Internet security makes us all safer, even though it helps the bad guys as well. And it makes no sense to harm all of us in an attempt to harm a small subset of us.

This essay originally appeared in Foreign Policy.

04 Jun 05:34

Comic for June 4, 2013

Bofh69

I'll remember that for the next document I write that no one but the reviewers will read.

12 May 17:13

The truth about sports people

29 Apr 05:19

29. April, 2013

27 Mar 14:09

Our Internet Surveillance State

by schneier

I'm going to start with three data points.

One: Some of the Chinese military hackers who were implicated in a broad set of attacks against the U.S. government and corporations were identified because they accessed Facebook from the same network infrastructure they used to carry out their attacks.

Two: Hector Monsegur, one of the leaders of the LulzSec hacker movement, was identified and arrested last year by the FBI. Although he practiced good computer security and used an anonymous relay service to protect his identity, he slipped up.

And three: Paula Broadwell, who had an affair with CIA director David Petraeus, similarly took extensive precautions to hide her identity. She never logged in to her anonymous e-mail service from her home network. Instead, she used hotel and other public networks when she e-mailed him. The FBI correlated hotel registration data from several different hotels -- and hers was the common name.

The Internet is a surveillance state. Whether we admit it to ourselves or not, and whether we like it or not, we're being tracked all the time. Google tracks us, both on its pages and on other pages it has access to. Facebook does the same; it even tracks non-Facebook users. Apple tracks us on our iPhones and iPads. One reporter used a tool called Collusion to track who was tracking him; 105 companies tracked his Internet use during one 36-hour period.

Increasingly, what we do on the Internet is being combined with other data about us. Unmasking Broadwell's identity involved correlating her Internet activity with her hotel stays. Everything we do now involves computers, and computers produce data as a natural by-product. Everything is now being saved and correlated, and many big-data companies make money by building up intimate profiles of our lives from a variety of sources.

Facebook, for example, correlates your online behavior with your purchasing habits offline. And there's more: there's location data from your cell phone, and there's a record of your movements from closed-circuit TVs.

This is ubiquitous surveillance: All of us being watched, all the time, and that data being stored forever. This is what a surveillance state looks like, and it's efficient beyond the wildest dreams of George Orwell.

Sure, we can take measures to prevent this. We can limit what we search on Google from our iPhones, and instead use computer web browsers that allow us to delete cookies. We can use an alias on Facebook. We can turn our cell phones off and spend cash. But increasingly, none of it matters.

There are simply too many ways to be tracked. The Internet, e-mail, cell phones, web browsers, social networking sites, search engines: these have become necessities, and it's fanciful to expect people to simply refuse to use them just because they don't like the spying, especially since the full extent of such spying is deliberately hidden from us and there are few alternatives being marketed by companies that don't spy.

This isn't something the free market can fix. We consumers have no choice in the matter. All the major companies that provide us with Internet services are interested in tracking us. Visit a website and it will almost certainly know who you are; there are lots of ways to be tracked without cookies. Cell phone companies routinely undo the web's privacy protection. One experiment at Carnegie Mellon took real-time videos of students on campus and was able to identify one-third of them by comparing their photos with publicly available tagged Facebook photos.

Maintaining privacy on the Internet is nearly impossible. If you forget even once to enable your protections, or click on the wrong link, or type the wrong thing, you've permanently attached your name to whatever anonymous service you're using. Monsegur slipped up once, and the FBI got him. If the director of the CIA can't maintain his privacy on the Internet, we've got no hope.

In today's world, governments and corporations are working together to keep things that way. Governments are happy to use the data corporations collect -- occasionally demanding that they collect more and save it longer -- to spy on us. And corporations are happy to buy data from governments. Together the powerful spy on the powerless, and they're not going to give up their positions of power, despite what the people want.

Fixing this requires strong government will, but governments are just as punch-drunk on data as the corporations. Slap-on-the-wrist fines notwithstanding, no one is agitating for better privacy laws.

So, we're done. Welcome to a world where Google knows exactly what sort of porn you all like, and more about your interests than your spouse does. Welcome to a world where your cell phone company knows exactly where you are all the time. Welcome to the end of private conversations, because increasingly your conversations are conducted by e-mail, text, or social networking sites.

And welcome to a world where all of this, and everything else that you do or is done on a computer, is saved, correlated, studied, passed around from company to company without your knowledge or consent; and where the government accesses it at will without a warrant.

Welcome to an Internet without privacy, and we've ended up here with hardly a fight.

This essay previously appeared on CNN.com, where it got 23,000 Facebook likes and 2,500 tweets -- by far the most widely distributed essay I've ever written.

Commentary.