Almost a year ago, I called out the Facebook Messenger app on Android for having insidious permissions settings. Unless you've been living under a rock - which, in modern-day terms, means you don't have access to the Internet - you've seen the furor that erupted when that post went viral these past few weeks. Last week Facebook made good on its promise to force users to download a separate app if they wanted to continue to IM their friends using its platform, which, I assume, is the reason my year-old post gained so much traction.
Unfortunately, I inaccurately used "Terms of Service" when referencing Messenger's permission settings on Android devices, and that sent the resulting massive worldwide debate off on a tangent. While I used the Facebook Messenger app as an example, the point of the article was to start a dialogue around the actual cost of free apps - all apps, not just the Messenger app.
I asked readers if they were like me at the time and blindly accepted permissions settings and terms of use clauses without reading them first. It seems most were.
I challenged everyone to read the small print and decide if they were OK with the cost of free. I've received literally thousands of emails and messages thanking me for raising the issue, and many readers reported that they were now reading these settings and permission requirements and choosing to uninstall, or simply not download, various apps, including Messenger.
But did they really?
This past week Facebook Messenger became the number one app on iTunes despite receiving one of the lowest ratings (an average of one star) that any app can earn. Yet everyone is still downloading it just the same. The app has a better rating on Google Play; however, Google reports a cumulative average rating across all versions of an app. If you take a look at the most recent ratings and comments, you'll discover the sentiment is just as poor for the current version of the app on Google Play as it is on iTunes.
So why are people downloading an app that they clearly don't like from a company they don't trust?
It raises the question: Does anyone really care about the level of privacy we give up or the amount of risk we take when granting apps, software, and mobile devices more and more access to our lives - not to mention the access we implicitly grant to our contacts' data?
Does Anyone Really Care About Their Privacy?
Apparently the answer is "no." We need that next hit of social connectivity and we'll pay a pretty hefty - and personal - price for it. The drug/drug-dealer analogy is not far-fetched. We're addicted to technology and we're prepared to risk just about anything to stay that way, even when deep down we know we're doing something potentially wrong.
I debated the issue of apps and privacy with Robert Scoble and Bryan Kramer on a recent episode of The Social Faceoff. Scoble, the Startup Liaison Officer for Rackspace and a popular technology evangelist, argued that we're "heading into a new age of context where we're going to be studied deeply by the smoke detector in our home, the car we're driving, the phone we're carrying, and the wearable tech we're wearing." On average, one-third of the audience at his presentations say they're worried about this, yet they - and he - are still "all in." He argues that there's no middle ground; people understand that if they choose to be all-out instead of all-in, the utility lost by not granting apps and devices access to personal data will make them losers "at the game of life...over and over again."
I am and have been a technology "all-in" person since that fateful day, long, long ago, when I created my first MySpace profile. I waded into the deep end when I migrated from my old BlackBerry to my Google-powered Android smartphone. Yet today I'm increasingly concerned about the ramifications of blindly diving in, and I have begun to delete apps, revoke certain permissions, and add additional layers of security software to my mobile devices.
User Privacy Concerns Are Being Dismissed
When I called out the Android Messenger permissions, many suggested I was pushing people to put on their tin foil hats, referencing the fashion accessory of choice for conspiracy theorists. The point they were trying to make was that I was being paranoid and that my fears were not grounded in reality or fact.
The reality is that technology enthusiasts and tech/social firms alike are dismissing the real fears most people have. Most casually acknowledge the risks and complain about them, but few actually do anything about it. That's likely because few think a security breach will happen to them.
Cybercriminals are increasingly exploiting social media sites and smartphones to embed themselves and their software into our devices and lives. According to a study by the University at Buffalo, once a cybercriminal has managed to gain access to an individual's network of friends and family, he or she can then befriend others in that network to pilfer their information.
The combination of smartphones and social media is indispensable to us, yet that dependency also offers fertile ground for hackers. We store so much data on our mobile devices and grant them so much access without proper security that we make it surprisingly easy for our data and devices to be used against us.
According to Eva Velasquez, President and CEO of the Identity Theft Resource Center, there are numerous reports describing how "individuals [have] managed to hack into entire cellular provider's networks, just like ones who access the financial information of a major retailer." It's not just losing a phone or having someone hack into the actual device that's a threat. "Phone calls, emails, text messages, and other forms of communication can show up on their radar, providing them with pieces of the puzzle they need in order to steal your identity," or piggyback on the device's permissions for more nefarious activity.
Tech and Social Companies Shirking Their Responsibility
Another way to look at this is to ask what social networks like Facebook - and its Messenger app - aren't telling us. Currently, Messenger, as an example, requires permission to access our contact database so that, according to Facebook, it can tell us which of our contacts also have the Messenger app. In reality, it doesn't need to scan our phone's contact database to do this. Facebook already knows who we're friends with on Facebook and could easily use that information to let the app build an IM directory.
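To make concrete how broad that permission is, here is a minimal Kotlin sketch - my illustration, not Facebook's code - of what any Android app granted the READ_CONTACTS permission can do through the standard ContactsContract API: read every name and phone number stored on the device, whether or not those people have anything to do with the app.

```kotlin
import android.content.Context
import android.provider.ContactsContract

// Illustrative only: with READ_CONTACTS granted, any app can pull the
// entire phone book through the public ContactsContract provider.
fun dumpContacts(context: Context): List<Pair<String, String>> {
    val contacts = mutableListOf<Pair<String, String>>()
    val cursor = context.contentResolver.query(
        ContactsContract.CommonDataKinds.Phone.CONTENT_URI,
        arrayOf(
            ContactsContract.CommonDataKinds.Phone.DISPLAY_NAME,
            ContactsContract.CommonDataKinds.Phone.NUMBER
        ),
        null, null, null
    )
    cursor?.use {
        while (it.moveToNext()) {
            // Every contact's name and number, not just the app's own users.
            contacts.add(it.getString(0) to it.getString(1))
        }
    }
    return contacts
}
```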
Why else would they want this access? It's quite possible that there is no ulterior motive - all the more reason these companies should spell out what they will not do with those permissions instead of just stating what they will. In Messenger's case, Facebook should confirm that the app will only use the contact list it scanned to generate a Messenger contact list and that it will not - and cannot - use it for any other purpose. But it doesn't.
When questioned in our debate, Scoble offered, "That's the huge problem as a software engineer, because I have no idea what I'm going to do with that data tomorrow, I really don't." That's exactly the point! They don't know and we don't know.
Software companies cannot predict everything their app may need to do in the future, so placing restrictions on how they access and use our data may limit their ability to develop future features. Fair enough; however, isn't that their responsibility? Shouldn't they be expected to come back and ask us if they want to use our data in a manner other than the one we explicitly agreed to?
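Android's runtime-permission model is one concrete version of "come back and ask": instead of relying on a blanket install-time grant, an app requests access at the moment it actually needs the data, and the user can say no. A minimal Kotlin sketch, assuming the standard AndroidX helpers; the activity name and request code here are hypothetical, for illustration only:

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.appcompat.app.AppCompatActivity
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

// Hypothetical activity name, for illustration.
class ContactSyncActivity : AppCompatActivity() {
    private val requestContactsCode = 42  // arbitrary request code

    fun syncContactsIfAllowed() {
        val granted = ContextCompat.checkSelfPermission(
            this, Manifest.permission.READ_CONTACTS
        ) == PackageManager.PERMISSION_GRANTED

        if (granted) {
            // The user has already agreed to this specific use of their data.
        } else {
            // Ask at the moment the data is needed, for this purpose,
            // rather than relying on a one-time install screen.
            ActivityCompat.requestPermissions(
                this,
                arrayOf(Manifest.permission.READ_CONTACTS),
                requestContactsCode
            )
        }
    }
}
```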
We know Facebook and Google (among others) track our usage patterns - who we speak to, when we speak to them, what we're sharing, what we're reading, and where we are every time we do so - in order to sell that data to advertisers, who, in turn, serve us more relevant advertising. Is it any surprise that people are wondering what else these companies may be tracking, and for what purpose? Given Facebook's history of communication missteps and secret experiments, this is not paranoia.
Can We Go Back?
Over-sharing is a national pastime thanks to the cultural and technological revolution that is social and mobile communications. While I'm not suggesting that we turn off all access to the world and burn our phones, the corporations supplying the technology that keeps us connected - and those of us who use it - must start paying more attention to the real risks that exist.
At the end of the day, however, the onus is on you, not the tech/social companies, to protect yourself. Can we go back? No. We'll continue to rely on mobile devices and we'll increasingly need to share personal data to benefit from their utility. However, we can take better steps to protect ourselves.
More of us need to challenge social networks and app developers by reading permissions settings, terms of service, and the rest, and, where we're not comfortable, we need to stand up and say no. You can install software such as Sophos Mobile Security (for Android), which scans apps for malware to protect your device and data, and you can check exactly what an app is asking for before you commit to it, as sketched below. There is always something that can be done.
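As one example of that kind of checking - a sketch only, using Android's public PackageManager API rather than any particular security product - here is a small Kotlin function that lists every permission an installed app has declared. The package name in the usage comment is Messenger's Android package, given as an example.

```kotlin
import android.content.Context
import android.content.pm.PackageManager

// Sketch: list every permission an installed app declares in its manifest.
// Throws PackageManager.NameNotFoundException if the package isn't installed.
fun requestedPermissions(context: Context, packageName: String): List<String> {
    val info = context.packageManager.getPackageInfo(
        packageName, PackageManager.GET_PERMISSIONS
    )
    return info.requestedPermissions?.toList() ?: emptyList()
}

// Example usage:
//   requestedPermissions(context, "com.facebook.orca")
//   returns entries such as "android.permission.READ_CONTACTS".
```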
The call is yours. Are you at all concerned about your privacy anymore? Are you willing to take the risks associated with being "all-in" in a context-based, hyper-connected world? Share your thoughts in the comments below.