Anonymity doesn’t work so how can we have privacy?


Secret is the hot new app in the Valley. People read it for the gossip, and people post gossip there because they can do so anonymously. Last week people were even chatting on Twitter about the amazing gossip on Secret!

Sam Altman of Y Combinator wrote this in a post about Secret:

Anonymity breeds meanness–the Internet has proven this time and time again.  People are willing to say nice or neutral things with their name attached–they need anonymity for mean things and things they are embarrassed about.  In fact, the closer to real identity internet forums get, the less they seem to decay.  Anonymous social networks have been (thus far, anyway) in the category of services that get worse as they get bigger–unlike services like Facebook or Twitter that get better as they get bigger.

This matches my experience of web communities. Trolls are always anonymous.

I think this observation leaves us with two theoretical choices for the future – either we have large online communities where people use their real identities with everything that means for privacy, or we don’t have online communities at all. However, the genie is out of the bottle and online communities are here to stay, meaning option two isn’t really an option at all, and we are destined to live in a world where people share stuff online using their real identities.

The interesting question, then, is what that world looks like, and the recent rise of Snapchat and other ephemeral messaging services can be understood as a move to share online more privately. I think we will see more of this sort of thing over time as society searches for a new equilibrium for privacy and sharing that works in the digital age. 

Has the NSA pushed people to finally do something about their privacy concerns?


[Chart: DuckDuckGo search query volume over time, annotated with milestones including an SF billboard campaign, inclusion in TIME's Top 50 Websites, USV's investment, a visual refresh, Google's privacy policy change, a Washington Post profile, and the surveillance revelations]
The chart above shows that query volume on the privacy-friendly search engine DuckDuckGo has risen dramatically since the NSA scandal broke.

For some time now, people have said when asked that they care deeply about privacy, yet there has been no evidence of that concern in their actions. Despite saying they were worried about misuse of their personal data, people continued to share more data on social media and to use more services that captured their personal data, thus increasing their exposure to identity theft and other abuses.

This data suggests the NSA scandal is pushing large numbers of people to a tipping point. If so, we may be approaching the watershed moment many privacy advocates have been predicting for some time.

I’m looking forward to seeing how this plays out.




The end of privacy by obscurity


A penny dropped for me when I read “Facebook’s graph search and the end of privacy by obscurity” on GigaOM recently. Regular readers will know that I’m of the opinion that we have more to gain than to lose by sharing our information, and that I hope that people with privacy concerns will slowly get over them as the benefits of sharing become clearer. However, when I sought to understand whether that hope was realistic I struggled with the diffuse and unfocused nature of people’s privacy concerns. They would say that they don’t like the idea of people being able to find out what they have been doing, but weren’t really able to say why that bothered them. I now see that the concern is not so much with privacy as with losing their obscurity.

In other words, the issue is not with the information that gets shared so much as the fact that it makes people visible. Some people (myself included) like to be visible, but other people find it uncomfortable. I think that discomfort is what leads many people to be concerned about their privacy in a Facebook and Google world.

These people do, of course, have the choice of not using Facebook and Google, but I think it’s pretty clear that for the vast majority their usefulness is worth the unease that comes with being more visible. That means that rather than hope that people with privacy concerns will get over them, I should hope that they get used to their increased visibility and it stops bothering them. After that the privacy concerns will fade of their own accord.

Ad-blocking by ISPs


I’m just back from an extended Xmas break and have been enjoying getting back into the news flow and thinking about markets and opportunities. The juiciest titbit this morning is undoubtedly the news that upstart French ISP Free is now shipping ad-blocking software and hardware to its customers with defaults set so all ads are blocked. Free has 5.2m subscribers and is the second biggest ISP in France, making this a significant move.

Free’s motivation for blocking ads is to re-open the debate about ISPs charging content providers to carry their content. Network operators have long been sore that they make slim or no profits because they have to carry so much traffic for Google whose profit margins are huge, but regulators insist that they treat all content the same and won’t allow them to charge content owners for transit or for priority treatment (i.e. they insist on Net Neutrality). Blocking ads throttles the revenues of content owners like Google and might force them to start discussing carriage fees.

My purpose here isn’t to revisit the arguments in favour of net neutrality (I’m in favour) but to observe that ad-blocking might shift the balance of power in the struggle between content owners and ISPs, and to note that Free is playing a dangerous game. The interesting thing about ISPs blocking ads is that it can be dressed up as a pro-consumer move, reversing the positioning of the net neutrality debate, where content owners were the consumer’s champion, offering free services threatened by money-grabbing ISPs. The reality, of course, is that for the ecosystem to work everybody needs to make at least a small profit, otherwise they will cease trading. In theory the market is the mechanism that sets prices so that everyone earns a fair return on their investment, but in practice market inefficiencies are commonplace and regulation is often required. Then, once you have regulation, companies are operating in the court of public opinion as well as in the business of providing products and services, which can result in companies like Free taking actions that take money out of the ecosystem overall without any direct benefit to their own financials.

I just checked the NYT for an update and the latest news is that the French government has ordered Free to stop blocking ads. I guess that puts the genie back in the bottle for now, which is a good thing. Without ads many, many companies would suffer (including quite a few in our portfolio) and the internet as we know it would disappear. I acknowledge that many people’s privacy concerns aren’t adequately addressed in the current setup, but I don’t think the situation is so bad that we should wreck the entire web and start rebuilding from scratch.

Musings on the tension between trust and anonymity


Google just changed the Google Play store so that reviewers’ Google+ name and profile picture are visible, with no option for anonymity, writes Techcrunch.

I think this is a good move because identity engenders trust and the lack of trust on the web relative to offline is one of the things that holds people back from using and enjoying the web more. This is a complex issue and forcing users to reveal themselves is no panacea, but it helps with the biggest problem with reviews – people don’t know whether they are genuine.

I think this point about identity engendering trust extends beyond reviews to all online services. Offline we unthinkingly take stock of the people we interact with using a large number of non-verbal cues, and identity enables us to do some of the same online. This is why interacting with people on Facebook feels better to most people than interacting with people on anonymous user groups. I think it is also why most people choose to have photos of themselves as their profile picture.

The other big benefit of forcing people to post under their real identity is that they generally behave more reasonably.

That said, this isn’t a black and white issue and there are situations where anonymity is a good thing. Whistleblowers and people reviewing products they don’t want to admit to owning are examples given in the comments to the Techcrunch post. Political rebels in authoritarian states are another obvious case. But to me these are edge cases and we shouldn’t design the architecture of the web around them. Moreover, whistleblowers and revolutionaries can find workarounds, including setting up fake accounts or using other services.

Finally, a word on the nightmare scenario of an authoritarian government using modern technology to control its population to nefarious ends. I guess my thoughts are that reducing anonymity increases the scope for abuse, but not meaningfully: for a while now there has been more than enough technology available for governments to take total control should they so wish. The right way to protect ourselves is to invest in and support our political systems rather than put controls on particular pieces of technology. I would argue that not only is this the right way, it is the only way, as any controls put in place by one government or regulator can be removed by another.

Measuring ad effectiveness by linking to offline sales


On Friday I wrote about how privacy advocates will welcome Facebook’s release of their Shared Activity plugin, which makes it easier for users to control how their actions on Facebook-connected third party sites show up in their Facebook feed. Today’s news points in the other direction. This morning I read of Facebook’s project with Datalogix to measure ad effectiveness by tracking whether people bought a product in a store after seeing a Facebook ad. They are using loyalty card data to make the link.

Predictably, privacy advocates are arguing that if Facebook is to go down this route they should only do it for consumers who explicitly opt-in.

These opt-in vs opt-out arguments are happening all the time now, and in some ways they are a bit of a charade. Very few people change their default settings, so rather than being about personal choice, deciding to make something opt-in effectively kills the project: to regulate that something be opt-in is, most of the time, to regulate it out of existence.

For me the advantages of linking online ads to offline sales far outweigh any risks. Better tracking leads to better targeting, allowing publishers to charge higher rates and show us fewer ads and/or offer us more stuff for free. Additionally, the ads we do see are more likely to be for something we want, and therefore less annoying.

There is a lot to be gained from this type of tracking, which is why Facebook’s advertising customers are pushing them down this route.

And I can’t see that there is much risk. Datalogix anonymises the data it buys from retailers so there is no way that Facebook can tell which of their users are actually buying what. For me, real privacy risks come when people can work out how to access my money or work out where I live, not from high level concerns that if the data is in existence something bad might happen. I’d love for someone to explain if there are any real concerns in this case.

Facebook makes privacy controls much simpler


Last night Facebook launched their Shared Activity plugin, which makes it much easier to control which of your activities on Facebook-connected third party sites show up in your Facebook feed. It’s a one click process that runs on the third party site so you don’t have to go back to Facebook at all. Venturebeat has this description of how it works:

Imagine you’re browsing around your favorite news site. You’ve previously logged into the site using your Facebook profile, because once you’re logged in, you get to play social games with Justin Bieber themes, and how fun is that? But you’re not sure you want everyone on your Facebook friends list to see your activity, so you check the site’s Shared Activity plug-in, which is already hovering in the bottom left corner of the screen, nice and obvious, and you click “No one” on the drop-down menu of groups to share with.

As regular readers will know, I’m a big fan of social media in general and Facebook in particular. Aside from their bungled IPO I think they’ve had a very positive influence on the world. By giving everyone a voice they’ve increased transparency and accountability right across society making values like integrity and commitment to quality more important and reducing the value of spin.

It hasn’t been all roses though, and the number one criticism of Facebook as a product is that it doesn’t adequately address people’s privacy concerns. The main thing people are worried about is other people seeing what they’ve been doing online, and this plugin addresses that concern head on.

If you’ve been watching Facebook over the years you might well be noting that this is the first time they’ve deliberately made it easy for people to restrict the amount of information they share. One explanation for the change is that this move makes it more difficult for potential competitors who would need to match Facebook for privacy, which would mean less sharing and hence a harder time getting to critical mass. Getting to scale and then changing the rules of the game so others can’t copy you is a smart play. Another explanation is that they think the PR benefits from being privacy friendly will outweigh the negative of reduced traffic that will inevitably follow reduced sharing. The final explanation is that they have done this because they think their users want it. This would be nice to believe, but runs counter to behaviour we’ve seen from Zuck and co in the past.

Exposing the misunderstanding behind the belief that social media makes us less social


Regular readers will know I’m a big believer in the power of social media as a force for good in our society. Now that everyone can publish their opinion and no-one can control the media integrity is becoming more and more important for brands and individuals, and I think that’s great.

Many people think differently, however. The most recent example is MIT professor Sherry Turkle, who wrote a piece in the New York Times arguing that all the connecting people are doing on Facebook comes at the expense of real conversations. In other words, social media is bad because it weakens our relationships. Turkle’s commentary adds to a daily stream of stories from publications like the Daily Mail about the problems that social media is bringing to our society, from people embarrassing themselves by sharing inappropriately to people using Facebook to approach young girls for sex.

These stories irk me because they don’t reflect the truth of the matter, so I was pleased to see GigaOM produce a comprehensive analysis of how Turkle has misunderstood the role that social media plays in people’s lives.

As you can read in more detail on GigaOM, the flaw in Turkle’s reasoning is that status updates on Facebook and texting are not substitutes for conversation or deep relationships; rather, they are complements to other deeper, richer forms of communication, including one-on-one conversations. In fact, the research shows that people who are more social online are also more social offline.

I think a lot of the negative sentiment surrounding social media is best understood as fear of the new and fear of change, and echoes the moral panics that have historically accompanied new forms of media. I strongly believe that over time social media will become an accepted and valued part of the fabric of society in the same way as telephones, television and many other new technologies have before.

That said, there are many important concerns around social media that need to be addressed and probably regulated for, most obviously child safety and privacy. However, those concerns are best addressed in an atmosphere of calm and well informed debate. My fear, and reason for writing this post, is that we will get regulation driven more by fear than logic.


Google’s recent troubles give an insight into how much and why privacy matters


Danny Sullivan put up a good post yesterday: On Google & Being “Evil”. His main point was that Google is now a very big company, and that inevitably means they will make mistakes, including ones which impact their users’ privacy, but they are no more evil (or good) than any other large company. I agree with that: they are becoming no different from other large vertically integrated service providers like Facebook and Apple. As Danny points out, there is one important historical difference though, and that is Google’s “Don’t be evil” positioning, which gives the world’s largest search engine much further to fall than its competitors.

Danny’s main point is interesting, but it is his description of Google’s recent problems and the government and public response to them which I am going to focus on today.

Let’s start with a recap on what Google has done.

Most importantly they are changing their privacy policies. If you use any Google services you will have seen a pop-up informing you of this fact, and if you are anything like me you will have seen so many pop-ups that you have started to get annoyed by them. Also, if you are anything like me (and 90% of the rest of the population) you won’t have read them, but you will have caught sight of various headlines suggesting that you should be worried about what these changes mean.

Beyond the privacy changes Google has made a few gaffes recently. Danny lists three:

The response to this has been markedly different in different parts of society. On the one hand politicians, journalists and the intelligentsia are outraged, whilst on the other hand the public doesn’t seem to care. The picture at the end of this post shows the top results on a Google News search for “Google privacy policy changes”. The negativity is clear for all to see. Yet Danny Sullivan reports that for the mass public:

there’s no mass movement to abandon Google. Take a tour of its help forums, as I’ve explained before. [Privacy] It’s not a huge topic.

I’ve been having an increasing number of conversations in recent weeks with folk (largely educated, wealthy, middle aged folk) who think that a privacy backlash is coming, but I just don’t see it. Further, when I press these folks on what the precise issue is, or what might precipitate a sudden elevation of this issue in the mind of Joe Public they have no answers.

However, politicians are now legislating for privacy and companies have to deal with that. The important thing to note though is that it is a regulatory issue, not a product issue. Hence internet companies need to deal with privacy issues in the same way as financial services companies have dealt with regulators for years. They need to lobby to make sure they don’t get blind-sided, and many will try to use regulation for competitive advantage.

I was at a dinner for our portfolio company StrikeAd last night. They operate in the mobile advertising industry where everybody is very concerned about the privacy implications of targeting and tracking users. As a result voluntary codes of conduct are being drawn up, they are being endorsed by standards bodies, and those bodies are then certifying vendors as compliant, and listing service providers as qualified to provide certification. If you want to play in certain segments of the market, for example ad verification, then you need to play the game and make sure you are on the list of vendors qualified to provide certification. In other words, for companies in this market lobbying and being part of the regulatory process is as important as it is for banks and telcos.

Privacy is important, and users need to be protected, but as you can see from the story in the paragraph above a lot is being done already to prevent abuse of personal data. I think we will see more regulation in this area and companies will devote increasing resources to making sure personal data is protected and only used in the right ways, but I see no evidence that the greater public cares enough that we will see anything more than that.



Facebook privacy questions should be judged by consumers not politicians


As you may well have seen, there is quite a furore at the moment about Facebook’s new privacy settings (see the articles today in the NYT, FT and GigaOM). It does seem to me that Facebook could be doing a better job here, at least by making privacy settings simpler and communicating what they are doing better.

But that does not mean they should be forced to change their service by the regulator, as has happened already in Canada, and now seems a possibility in Europe.

Nobody is compelled to use Facebook and everybody has the option of not sharing any data, by closing their account.  To me this makes Facebook’s privacy settings an issue to be decided in the market by consumers not by politicians.  Google Street View is also getting a lot of criticism, but this case is different because everybody on Facebook chooses to be there.

Long time privacy advocate Alan Patrick wrote a blog post yesterday which talks about a competing service to Facebook which has been getting a lot of early support – if people care about privacy they will migrate to alternative social networks like the one Alan describes.

Meanwhile, within reason Facebook should have the right to build the service it chooses and do what it needs to do to make money.  Right now it seems that politicians are stepping into areas that should be left to Facebook.  For example in Germany they are talking about compelling Facebook to allow users to create accounts under pseudonyms, which runs contrary to Facebook’s philosophy since the start (see FT article), and in Europe generally they are seeking to regulate default privacy settings in a detailed manner (see PaidContent article).

This matter is naturally of huge concern to Facebook, who are holding an all-hands meeting on the subject today, because they rightly fear that regulators might undermine their business.  It would be to the detriment of just about all of us if that were to happen.

A caveat to finish: protecting minors is a different matter to everything I’ve talked about here, and should be of prime concern to all of us, including governments and regulators.