Wednesday, February 15, 2017

Internet Privacy: Who Exactly Has the Key?

Futurist Gerd Leonhard took the perspective of what he termed a “nowist” in his TEDx talk on digital ethics.  Why?  Because he is concerned that if we do not fully consider the ethics of technology now, “We may be facing threat of extinction by our own inventions.”  Technology, unlike humans, can move at warp speed.  It does not have ethics.  Leonhard notes that we are allowing technology to track us, measure us, advise us, and connect us.  If we do not impart some clear ethical considerations now, we could end up in a future driven by the exponential rather than the humanential; one that values speed, power, and the network over depth, meaning, and self-realization.  We are at a critical juncture where much of what we have taken for granted (for example, our right to a degree of privacy) is becoming less and less a guarantee.

Internet privacy has become enough of an area of ethical concern that those in the know just celebrated Data Privacy Day on January 28.  Deborah Johnson, as quoted in Terrell Bynum’s discussion of information ethics, sees privacy concerns as simply “new species of old moral issues” rather than “wholly new ethics problems requiring additions to traditional ethical theories,” but the situation definitely requires our attention, new theories or not.  Throughout the world, governments and organizations seem pitted against civil rights and privacy groups in determining the proper balance between privacy and security.  Is privacy an absolute human right?  Can security and privacy dwell together?  What about freedom of information?

Had I written this piece five years ago, I would have focused solely on our rights to online privacy: protecting our data, our identities, and our employers’ rights (or not) to our social media presence.  In 2017, however, internet privacy extends to the Internet of Things (IoT): our wearables, cars, thermostats, entertainment systems … all of which send non-stop streams of our personal data into the network.  Since all of us engaging with the IoT are now integrated into the internet, this post will survey privacy concerns with the internet itself as well as with the IoT.


A January 2017 internet privacy update from VPN service provider ExpressVPN notes that the U.S. and the European Union agree that stealing, which includes violating someone’s privacy, is wrong and illegal.  However, in several recent cases where Facebook users felt their privacy had been invaded, the determination was that although the information had been used in ways the users were not expecting, no privacy laws had been broken.  This type of “surprise usage” of our private data is considered a “creep factor.”  So although what a company is doing with our data may be legal, to be truly ethical a company should be clear about how it is using that data; otherwise it risks losing our trust.  And talk about the creep factor: consider our right “to be forgotten.”  Our online communications and postings do not go away.  They are our legacy, wanted or not, except in the European Union, which in May 2014 ruled that its citizens had a “right to be forgotten” and directed Google to delete “inadequate, irrelevant or no longer relevant” pages from its search results.

And although there are restrictions on how information may be used, the majority of world governments tap internet traffic as part of national security programs.  It is just this level of surveillance, whether by governments or corporations, that concerns citizen advocacy groups like the ACLU, which fear the extent to which such scrutiny can hamper free speech and association, chill the free exercise of religion, and undermine a free media.  A related article gives the example of Boston residents who sought to stop the police from continuing a digital surveillance plan after finding out that the surveillance companies with which the police were contracting advertised themselves as “helping law enforcement officials avoid the warrant process in investigations” and “providing a means of spying on dissidents.”

Late last year the Federal Communications Commission (FCC), by a 3-2 party-line vote, passed new rules that require Internet providers to obtain a customer’s explicit consent before sharing personal information, such as app and browsing histories, mobile location data, and other information generated while using the Internet.  The mandate requires that service providers inform consumers about the data they collect and notify them of data breaches.  The restrictions cover trading in health data, financial information, Social Security numbers, and the content of emails and other digital messages.  Echoing Gerd Leonhard’s warning that ethical considerations need to be addressed as early as possible, Jay Stanley of the ACLU said, “If this was not done, it could have really hard-wired a surveillance infrastructure into the Internet itself.”

The ruling does not cover individual web companies such as Google and Facebook, which already make a great deal of money monetizing user information.  Internet service providers view this information as a source of revenue, so a legal challenge from the affected companies can be anticipated.


A December 2016 article by Andrew Meola in Business Insider projects that there will be 24 billion IoT devices by 2020.  Along with the benefits comes risk, as these connected devices give hackers and cybercriminals more entry points.  The IoT also generates a great deal of very personal data, which is now also part of the internet.  Meola mentions several key concerns, including:

·         The sheer volume of data:  It’s estimated that 10,000 households can generate 150 million discrete data points every day.
·         Public profile:  An insurance company could potentially collect data from your fitness tracker or smart car and use it to determine eligibility.
·         Access to your home:  German researchers were able to intercept unencrypted data from a smart meter and use it to determine what television shows people were watching.
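To put the first bullet in perspective, a quick back-of-the-envelope calculation (using only the figures quoted from Meola above) shows what that volume means for a single household:

```python
# Back-of-the-envelope check of the data-volume estimate quoted above:
# 10,000 households generating 150 million discrete data points per day.
households = 10_000
daily_points = 150_000_000

points_per_household = daily_points // households      # per household, per day
points_per_minute = points_per_household / (24 * 60)   # per household, per minute

print(points_per_household)         # 15000
print(round(points_per_minute, 1))  # 10.4
```

In other words, roughly ten data points leave each connected home every minute, around the clock.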

Lauren Zanolli, writing for Fast Company, feels we are generally underplaying the threat to our privacy that the IoT presents.  She quotes Josh Corman, a security expert and cofounder of I Am The Cavalry: "What we've done is blindly assume that [adding software and connectivity] is always good. And we're making really horrible, horrible choices."  Echoing these concerns, Lee Tien, senior staff attorney at the Electronic Frontier Foundation, points to the faded lines between private and public data stores in the post-9/11 era: "We’re long past the days when we can really think of private sector collection of data and government collection of data as two separate silos."

In 2013, which seems like a generation ago when considering the IoT, the FTC met to discuss the Fair Information Practice Principles (FIPPs).  The FIPPs have formed the basis of government and private-sector privacy regulations, covering principles that include notice, data minimization, choice, and security.  The FTC’s goal was to determine how the FIPPs should apply to the IoT, and it identified four priorities:

·         Build strong security in from the beginning and keep it strong.

·         Minimize data by collecting as little as possible and getting rid of it as soon as possible.

·         Ensure consumers know what data is being collected, and

·         Allow them to opt out.
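The second priority, data minimization, is the most concrete of the four, and it is worth seeing what it looks like in practice. The sketch below is purely illustrative (all names, fields, and the 30-day retention window are my own hypothetical choices, not anything from the FTC report): collect only a whitelist of fields, and discard records once they age out.

```python
from datetime import datetime, timedelta, timezone

# Illustrative sketch of the "minimize data" priority: a field whitelist
# plus time-based deletion. ALLOWED_FIELDS and RETENTION are hypothetical.
ALLOWED_FIELDS = {"device_id", "reading", "timestamp"}  # collect as little as possible
RETENTION = timedelta(days=30)                          # keep it as briefly as possible

def minimize(record: dict) -> dict:
    """Keep only the fields the service actually needs."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

def purge_expired(store: list, now: datetime) -> list:
    """Drop records older than the retention window."""
    return [r for r in store if now - r["timestamp"] <= RETENTION]

# Example: an IoT reading arrives carrying extra personal detail.
now = datetime(2017, 2, 15, tzinfo=timezone.utc)
raw = {"device_id": "thermostat-7", "reading": 68.5, "timestamp": now,
       "owner_name": "Alice", "home_address": "..."}
stored = minimize(raw)  # owner_name and home_address are never stored
```

The point of the pattern is that data which is never collected, or which has already been deleted, cannot leak in a breach or be repurposed in a "surprise usage."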

An example of poor security cited in the report involved TRENDnet, which marketed its Internet-connected cameras as “secure” for a variety of home uses, including baby monitoring.  However, the report covered numerous security breaches in which hackers were able to access live feeds from consumers’ security cameras and conduct “unauthorized surveillance of infants sleeping in their cribs, young children playing, and adults engaging in typical daily activities.”

In 2015, Kelsey Clubb, Lisa Kirch, and Nital Patwa of Berkeley considered the ethical implications of the IoT, and their research suggests that the IoT actually becomes an internet of behavior.  We are allowing access to all kinds of personal data, from our heart rates to our conversations, journals to driving habits, reading to traveling.  More and more of our behavior is there to be viewed and assessed, and ultimately it could be used against us in court.  It could influence our applications for insurance, housing, or a job.  And the protections in place are only partially protective.  In terms of the ethics of the data: who exactly owns this dataset, and who has access to it?  Not determined.  What are the business practices of the companies that have access to it?  Hopefully ethical.  How is the dataset being used and shared?  There are limited parameters.  Oh yes, and what happens when the system is hacked?  Oops!!

As consumers we really need to be aware and vigilant.  Although we would like to imagine everyone as being ethical, our privacy is largely our responsibility.  Laws can help, but they can’t keep up with the pace of change.  Consumer organizations like I Am The Cavalry and the Electronic Frontier Foundation can help keep us apprised.  Research projects like Ubiquitous Commons seek to design legal and technological toolkits that will help us better control the use of our data.  But ultimately it is up to you and me to be aware, and to know that we are making choices to share (or not) multiple times a day.  And honestly, the whole concept of privacy may be a mirage.


  1. "...And honestly – the whole concept of privacy may be a mirage."

    Very true. Just yesterday, Yahoo reported a third breach of their network. The technology for driverless cars is coming much faster than the security safeguards against car hacking. Many are adding voice-activated devices to their homes like the Echo or Google Assistant, but are they aware of how their data is being gathered, used, and sold?

    I wonder if this genie is already out of the bottle? Balancing privacy against gains in human augmentation is going to be difficult.

  2. Dr. Watwood - I tend to think it has escaped the bottle and its magic potion (a mix of good and evil) is swirling all about. The laws to protect us seem quaint when compared to the robustness of the internet and the IoT. Giving up our privacy via the methods you mention (and more) seems in some ways a small price to pay for all the conveniences (and some necessities) of the connected world. "SEEMS" a small price. Ultimately it may leave us broke. On that sad note ... I'd better ask my Echo for some upbeat music ;-)

  3. Tricia,

    Fantastic post! Thank you for highlighting these critical aspects of privacy and ethics, and sharing so many great resources. Your point that we must consider the ethics of technology is very insightful. Gartner analyst Frank Buytendijk noted that YouTube was built by not paying much attention to copyright laws, and that Facebook continues to test privacy boundaries. Buytendijk used Kohlberg’s moral stages of development to describe the ethical levels that businesses should strive to attain. The lowest level is compliance—the rules need to be followed. At the next level, risk, technology is used in a manner that minimizes negative consequences. Next, good behavior is an opportunity to positively showcase and differentiate a business. Finally, at the highest level, intrinsic motivation pushes a business to do the right thing. You make an excellent case that good digital choices should be part of everyone’s DNA. As leaders, our good choices can help businesses attain the highest ethical levels!


  4. Thanks CatOnKB! Thanks for another great resource. Really appreciate the Kohlberg moral stages approach. That's a super guideline to use as a leader lens. There are times when individual moral behavior does not seem mirrored in the larger organization but this presents a simple map for scaling up individual standards. Much appreciated ~Tricia

  5. Tricia,
    Another great post!
    I think that the genie(s) is/are out of the bottle. We are desperately terrified of the misuse of our data, but live in a world where bad guys are trying to steal our data every day. As I mentioned in my post, we often must listen in on some good guys in order to get the bad guys. I am not sure that there are grand rules that stand as principles to guide all situations. When the FBI wanted Apple to help break the encryption on an iPhone used by terrorist Syed Rizwan Farook (from the San Bernardino shooting), the wagons were circled around principles. I fell on the side of Bill Gates, who said that they should help since it was a specific case where the government asked for specific (not general) access to information it needed. His quote was memorable: "It is no different than [the question of] should anybody ever have been able to tell the phone company to get information, should anybody be able to get at bank records. Let's say the bank had tied a ribbon round the disk drive and said, 'Don't make me cut this ribbon because you'll make me cut it many times'." In the end, they didn’t help, but the FBI got to the information anyway.
    I find it awfully depressing not to trust (most) people. I insist, in line with our Jesuit education, that God is in all souls – and should win against evil most of the time. I am not naïve of the dangers, but tend to trust the elected government stewards in their efforts to protect us (citizens). What recently terrified me was the situation with General Flynn being relieved of his job. I understand that, if he was deceptive to his superiors, then that is easily grounds to be let go. But my fear was how a discussion he had with a Russian ambassador was leaked publicly. He, as a U.S. citizen, should have had his identity protected – by law. But folks in the intelligence community decided to leak the information out. So, privacy laws were broken, and the Russians now know that we have a way to intercept phone calls made by their ambassador. Although I am sure that 99% of the employees in the intelligence communities are honest and upright folks, whatever their politics, these few have now truly brought an Orwellian nightmare back to our dreams: an unelected, secret force that operates above the law. And their target seemed to be the actual elected government stewards who should be their masters. Whether folks were happy to see General Flynn out, or disappointed, everyone should be concerned about how it happened. Those who were okay with the series of events are today’s ganders, but may be tomorrow’s geese.

    1. Shawn - Thanks for your comments. I also believe in our inherent good nature however also believe that in a number of cases human goodness becomes clouded by justifications for less than moral behavior. It seems General Flynn's treatment was not just. And along with his situation is the interference by a foreign government into our election ... and on and on. And look how technology and human nature are intertwined. This is our future. And I wonder what our rights to privacy will look like 10 years out. Some of us will remember a before and after. I think we are on the cusp of change in this realm. Some for the good and some that you and I may not feel entirely comfortable with. Thanks for engaging. ~Tricia

  6. Tricia, wonderful post. I followed the Apple case Peopleologist referenced above with great interest last year. While I am always cognizant of what I am putting out there, where specifically, and with what understanding of privacy, I generally lean toward less to avoid having to pick through the fine print for every system. I also consider everything from the view of “if it became public,” would I be fine? This means possibly I do not take advantage of a lot of features or tools, but the basic assumption that anything could be “leaked” is part of my decision-making.

    With regard to the Apple case, I landed on Apple’s side. As I understand it, it was not about hacking a specific phone, which they had done many times in the past when asked. It was about creating a flaw that did not exist, which could be applied indiscriminately to many phones by anyone (if leaked and in the wrong hands). Apple had tried to make its consumers safer with advanced protection that even it could not get around, and creating what the government wanted would remove those protections for everyone not just on the one phone it wanted access to. Zetter (2016) noted that the issue raised an interesting question. With regard to privacy and security, companies have been focused on protecting data from external attacks. This issue forces companies like Apple to consider how they might protect customers from the company itself.

    Zetter, K. (2016, February 18). Apple’s FBI battle is complicated. Here’s what’s really going on. Wired. Retrieved from

    1. Julie - thanks for your thoughts. The Apple case is an excellent one and I appreciated that you and Peopleologist both brought it in and took different sides. I also followed that case last year and felt like I was on a debate team because as it unfolded I felt as if I could make fairly good arguments either way. In the end I admired Apple's position. I could also argue against it, but like you, it was the one in which I found balance in the scales of justice. Appreciate your connecting. ~Tricia

  7. Tricia,

    One of my favorite TED Talks to show my students is related to Internet privacy. The speaker outlines the relationship between what you like and what you are revealing about yourself online. She suggested that someone’s education level could be determined by looking at attributes that have nothing to do with schooling. She said these connections could be explained by the theory of homophily. In other words, people tend to associate with others who have similar interests, so they often like the same things. Therefore, liking a page like curly fries could tell a social media organization a lot about your education level. It is a fascinating TED Talk if you have ten minutes.


    Golbeck, J. (2013). The curly conundrum: Why social media “likes” say more than you might think. [Ted Talk]. Retrieved from: