Archive for the ‘privacy’ Category

Facebook’s new tag suggestion feature works by using facial recognition technology to evaluate photos in which you’ve already been tagged and then suggests your name when friends upload a photo that looks like you.
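
Facebook hasn’t published the internals, but face-recognition features like this typically reduce each face to a numeric vector (the “facial fingerprint” mentioned below) and then compare vectors for similarity. Here’s a toy Python sketch of just the matching step – in a real system the embeddings would come from a trained model, and nothing here is Facebook’s actual code:

    import math

    def cosine_similarity(a, b):
        """Similarity of two embedding vectors; 1.0 means same direction."""
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.sqrt(sum(x * x for x in a)) *
                      math.sqrt(sum(x * x for x in b)))

    def suggest_tags(new_face, known_faces, threshold=0.8):
        """Suggest friends whose stored fingerprints resemble a new photo.

        known_faces maps a friend's name to an embedding built from
        photos in which that friend was previously tagged.
        """
        return [name for name, stored in known_faces.items()
                if cosine_similarity(new_face, stored) >= threshold]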

Like most new Facebook features, this is turned on by default, once again proving that Facebook just doesn’t get it about privacy. If you would prefer not to have Facebook store your “photo comparison information”, you need to opt out manually. The Electronic Frontier Foundation published a great video showing three ways to delete your “facial fingerprint” from Facebook.

The short version is:
Account/Privacy Settings/Customize Settings/Suggest photos of me to friends/Disable
followed by
Help Center/Photo tagging/How can I remove the summary information stored about me for tag suggestions? and click “contact us”

It’s a short video but well worth watching.

Score one for the Constitution! The US 6th Circuit just announced a decision upholding the requirement that police obtain a warrant before compelling an ISP to turn over your emails.

The background is that Steven Warshak was accused and eventually convicted of attempting to defraud the customers of Berkeley Premium Nutraceuticals (the distributor of Enzyte, an herbal supplement with some really goofy but apparently amazingly successful late-night ads). The government agents in this case believed that they did not need a warrant because of some ambiguous provisions of the Stored Communications Act. (The SCA was written in 1986 and had the unfortunate effect of codifying the technology of that era; it has not held up well to the test of time.)

A number of privacy groups including EFF weighed in on the topic, successfully arguing that email users have a Fourth Amendment-protected expectation of privacy in the email they store with their email providers just like they do with traditional forms of communication like postal mail and telephone calls.

A warrant is easy to get, and it’s unfortunate that the police in this case didn’t take the few extra minutes to document their probable cause. But the warrant requirement is an important check on prosecutorial power, and the 6th Circuit did the right thing in finding that the Fourth Amendment applies to email, too. (They also did the right thing by ruling narrowly, overturning only part of the case: Warshak used some pretty sleazy practices and deserved to be put out of business.)

Next steps: It’s time for Congress to update the SCA.

Go to your Facebook page and take a screen-shot. Paste that into a Word document or Paint program. Now cover up the names and pictures and project the result up on the wall. What does it say about you? Would your friends recognize you? Your parents? Yourself?
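
If covering things up in Paint feels tedious, a few lines of Python with the Pillow imaging library will do the redaction. The filename and rectangle coordinates below are placeholders you’d adjust to wherever names and faces appear in your own screenshot:

    from PIL import Image, ImageDraw  # pip install Pillow

    # Placeholder filename and coordinates - adjust for your screenshot.
    img = Image.open("facebook.png")
    draw = ImageDraw.Draw(img)
    for box in [(20, 40, 220, 90), (20, 110, 220, 310)]:  # (left, top, right, bottom)
        draw.rectangle(box, fill="black")
    img.save("facebook_redacted.png")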

Howard Rheingold, a social-media professor at Stanford University, runs this experiment with his class. It’s surprising – and a bit frightening – what you learn about yourself this way. As one of his students put it, Facebook tacitly encourages you to describe yourself in headlines: snippets, soundbites and stereotypes. You list a few specific interests, but since readers see only that subset, they make assumptions from that first impression. Many people who take a neutral look at their own profile discover that it presents a very shallow image.

Worse, they find that it rarely presents an image of responsibility and trustworthiness. Now that so many employers include Facebook in their background checks, that image can really limit your options later on.

Facebook does have some privacy settings that can minimize the damage, but only if you take the time to set them correctly – and even then, only if the settings you choose today happen to match the privacy you’ll need in 5 or 10 years. The better answer is to control what you post and what you allow others to post about you. If there’s something embarrassing, take it down.

The other thing to remember is that Facebook will probably not be the last word in social media. New programs will come out and hopefully they’ll take a stronger approach to privacy and foresight. In the meantime, be cautious about what you post in any social media. Be a little paranoid. Watch out for yourself.

[W]e’re in favor of strong encryption, robust encryption. The country needs it, industry needs it. We just want to make sure we have a trap door and key under some judge’s authority where we can get there if somebody is planning a crime.
– FBI Director Louis Freeh, May 11, 1995

They can promise strong encryption. They just need to figure out how they can provide us plain text.
– FBI General Counsel Valerie Caproni, September 27, 2010

Encryption backdoors were declared dead in 2001. Unfortunately, the proposal has reared its ugly head again. EFF published a reminder about why it was a bad idea then and is still a bad idea now. It’s important enough to quote in its entirety. With elections coming, please vote to protect your privacy rights.


For those who weren’t following digital civil liberties issues in 1995, or for those who have forgotten, here’s a refresher list of why forcing companies to break their own privacy and security measures by installing a back door was a bad idea 15 years ago. We’ll be posting more analysis when more details on the “new” proposal emerge, but this list is a start:

  1. It will create security risks. Don’t take our word for it. Computer security expert Steven Bellovin has explained some of the problems. First, it’s hard to secure communications properly even between two parties. Cryptography with a back door adds a third party, requiring a more complex protocol, and as Bellovin puts it: “Many previous attempts to add such features have resulted in new, easily exploited security flaws rather than better law enforcement access.”

    It doesn’t end there. Bellovin notes:

    Complexity in the protocols isn’t the only problem; protocols require computer programs to implement them, and more complex code generally creates more exploitable bugs. In the most notorious incident of this type, a cell phone switch in Greece was hacked by an unknown party. The so-called ‘lawful intercept’ mechanisms in the switch — that is, the features designed to permit the police to wiretap calls easily — were abused by the attacker to monitor at least a hundred cell phones, up to and including the prime minister’s. This attack would not have been possible if the vendor hadn’t written the lawful intercept code.

    More recently, as security researcher Susan Landau explains, “an IBM researcher found that a Cisco wiretapping architecture designed to accommodate law-enforcement requirements — a system already in use by major carriers — had numerous security holes in its design. This would have made it easy to break into the communications network and surreptitiously wiretap private communications.”

    The same is true for Google, which had its “compliance” technologies hacked by China.

    This isn’t just a problem for you and me and millions of companies that need secure communications. What will the government itself use for secure communications? The FBI and other government agencies currently use many commercial products — the same ones they want to force to have a back door. How will the FBI stop people from un-backdooring their deployments? Or does the government plan to stop using commercial communications technologies altogether?

  2. It won’t stop the bad guys. Users who want strong encryption will be able to get it — from Germany, Finland, Israel, and many other places in the world where it’s offered for sale and for free. In 1996, the National Research Council did a study called “Cryptography’s Role in Securing the Information Society,” nicknamed CRISIS. Here’s what they said:

    Products using unescrowed encryption are in use today by millions of users, and such products are available from many difficult-to-censor Internet sites abroad. Users could pre-encrypt their data, using whatever means were available, before their data were accepted by an escrowed encryption device or system. Users could store their data on remote computers, accessible through the click of a mouse but otherwise unknown to anyone but the data owner; such practices could occur quite legally even with a ban on the use of unescrowed encryption. Knowledge of strong encryption techniques is available from official U.S. government publications and other sources worldwide, and experts understanding how to use such knowledge might well be in high demand from criminal elements. — CRISIS Report at 303

    None of that has changed. And of course, more encryption technology is more readily available today than it was in 1996.

  3. It will harm innovation. In order to ensure that no “untappable” technology exists, we’ll likely see a technology mandate and a draconian regulatory framework. The implications of this for America’s leadership in innovation are dire. Could Mark Zuckerberg have built Facebook in his dorm room if he’d had to build in surveillance capabilities before launch in order to avoid government fines? Would Skype have ever happened if it had been forced to include an artificial bottleneck to allow government easy access to all of your peer-to-peer communications?

    This has especially serious implications for the open source community and small innovators. Some open source developers have already taken a stand against building back doors into software.

  4. It will harm US business. If, thanks to this proposal, US businesses cannot innovate and cannot offer truly secure products, we’re just handing business over to foreign companies who don’t have such limitations. Nokia, Siemens, and Ericsson would all be happy to take a heaping share of the communications technology business from US companies. And it’s not just telecom carriers and VOIP providers at risk. Many game consoles that people can use to play over the Internet, such as the Xbox, allow gamers to chat with each other while they play. They’d have to be tappable, too.
  5. It will cost consumers. Any additional mandates on service providers will require them to spend millions of dollars making their technologies compliant with the new rules. And there’s no real question about who will foot the bill: the providers will pass those costs onto their customers. (And of course, if the government were to pay for it, they would be using taxpayer dollars.)
  6. It will be unconstitutional. Of course, we wouldn’t be EFF if we didn’t point out the myriad constitutional problems. The details of how a cryptography regulation or mandate will be unconstitutional may vary, but there are serious problems with nearly every iteration of a “no encryption allowed” proposal that we’ve seen so far. Some likely problems:
    • The First Amendment would likely be violated by a ban on all fully encrypted speech.
    • The First Amendment would likely not allow a ban of any software that can allow untappable secrecy. Software is speech, after all, and this is one of the key ways we defeated this bad idea last time.
    • The Fourth Amendment would not allow requiring disclosure of a key to the backdoor into our houses so the government can read our “papers” in advance of a showing of probable cause, and our digital communications shouldn’t be treated any differently.
    • The Fifth Amendment would be implicated by required disclosure of private papers and the forced utterance of incriminating testimony.
    • Right to privacy. Both the right to be left alone and informational privacy rights would be implicated.
  7. It will be a huge outlay of tax dollars. As noted below, wiretapping is still a relatively rare tool of government. Yet the tax dollars needed to create a huge regulatory infrastructure staffed with government bureaucrats who can enforce the mandates will be very high. So, the taxpayers would end up paying for more expensive technology, higher taxes, and lost privacy, all for the relatively rare chance that motivated criminals will act “in the clear” by not using encryption readily available from a German or Israeli company or for free online.
  8. The government hasn’t shown that encryption is a problem. How many investigations have been thwarted or significantly harmed by encryption that could not be broken? In 2009, the government reported only one instance of encryption that they needed to break out of 2,376 court-approved wiretaps, and it ultimately didn’t prevent investigators from obtaining the communications they were after.

    The New York Times reports that the government officials pushing for this have only come up with a few examples (and it’s not clear that all of the examples actually involve encryption) and no real facts that would allow independent investigation or confirmation. More examples will undoubtedly surface in the FBI’s PR campaign, but we’ll be watching closely to see if underneath all the scary hype there’s actually a real problem demanding this expensive, intrusive solution.

The real issue with encryption may simply be that the FBI has to use more resources when they encounter it than when they don’t. Indeed, Bellovin argues: “Time has also shown that the government has almost always managed to go around encryption.” (One circumvention that’s worked before: keyloggers.) But if the FBI’s burden is the real issue here, then the words of the CRISIS Report are even truer today than they were in 1996:

It is true that the spread of encryption technologies will add to the burden of those in government who are charged with carrying out certain law enforcement and intelligence activities. But the many benefits to society of widespread commercial and private use of cryptography outweigh the disadvantages.
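
To make the escrow problem in items 1 and 2 concrete, here’s a minimal Python sketch using the Fernet recipe from the widely used cryptography package. It isn’t any real proposal’s design, just the generic “trap door and key” idea: every message’s key is also wrapped under a third-party escrow key, which becomes a single point of failure – and pre-encryption defeats the whole scheme anyway:

    from cryptography.fernet import Fernet  # pip install cryptography

    # Ordinary symmetric encryption: one key shared by the two parties.
    session_key = Fernet.generate_key()
    ciphertext = Fernet(session_key).encrypt(b"meet at noon")

    # The back door: the session key is also wrapped under an escrow
    # key held by a third party and shipped alongside the message.
    escrow_key = Fernet.generate_key()
    wrapped_key = Fernet(escrow_key).encrypt(session_key)

    # Whoever holds (or steals) the escrow key can unwrap every
    # session key ever escrowed and read all of the traffic.
    recovered = Fernet(escrow_key).decrypt(wrapped_key)
    assert Fernet(recovered).decrypt(ciphertext) == b"meet at noon"

    # And the CRISIS report's bypass: pre-encrypt under a key the
    # escrowed layer never sees, and the escrow holder recovers
    # only more ciphertext.
    private_key = Fernet.generate_key()  # never escrowed
    inner = Fernet(private_key).encrypt(b"meet at noon")
    outer = Fernet(session_key).encrypt(inner)
    assert Fernet(recovered).decrypt(outer) == inner  # still gibberish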

For several years now, I have smugly been talking about the weak privacy standards of Google and Facebook, confident that my providers were better than that. Well, it turns out that Yahoo is guilty of the same things. Yes, I use the Yahoo webmail service and I’ve been very happy with it. And, yes, I strongly recommend that everyone have a personal webmail account that is unconnected to your current work email.

Anyway, about three months ago, Yahoo launched several information-sharing services. If you use the Yahoo Contacts feature, other people in your address book can now see what you’ve been up to – postings, connections and other activities within the Yahoo sites. And you can see information about them.

In principle, I have nothing against features that let us share information with others. My problem is the underhanded way that these companies roll new features out. I never received any announcement about them and certainly got no training on my options to control the information they would be sharing. Worse, the default settings are “share all”. You have to know to look for the settings and then take deliberate action to restrict the sharing. I didn’t even notice the change for months. If these companies really cared about security, the defaults would go the other way.
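
Privacy-by-default is a design decision, not a technical hurdle. Here’s a hypothetical sketch of what it looks like in code: if every sharing flag ships as False, a new feature can’t silently expose anyone, and only users who opt in share anything:

    from dataclasses import dataclass

    # Hypothetical settings object: every sharing feature ships off.
    @dataclass
    class SharingSettings:
        share_my_updates: bool = False       # opt-in, not opt-out
        see_updates_in_mail: bool = False

    settings = SharingSettings()             # new feature launches...
    assert not settings.share_my_updates     # ...and shares nothing by default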

If you are a Yahoo user and you use their Contacts feature, here’s how to lock the program back down:

  1. Log onto your Yahoo Mail account.
  2. Click the Contacts tab at top left.
  3. Click the Tools dropdown and select ‘Seeing Updates from …’
  4. For a full lockdown, uncheck both the master settings at the top of the screen (‘Share my Updates’ and ‘See Updates in Yahoo Mail’)

If you like the sharing but want to restrict it to the people you are actually close with (rather than every random business contact that you’ve ever added to your BlackBerry), go through the list and select ‘Stop Getting Updates’ at the right of each contact’s name. You can also get a little more granular control using the ‘Manage my Updates’ link near the top left of the page. But blocking everything is easier.

The Yahoo Calendar also has some Sharing settings but since I don’t use their calendar feature, I don’t have good advice for how to lock it down. Any suggestions from people who do use it?