Read the fine print
It sometimes feels as if every day brings a new revelation about just how inventive some organizations have become at breaking their customers’ trust while profiting from their personal data. Providers might argue that their terms of service “clearly” state they have a right to do so, but do we really believe that most people truly understand them?
Let’s be honest
It made me think: what if terms of service were written in plain language that broadcast the service provider’s intent? What would that look like? So, in the spirit of the “Honest Trailers” film trailer parodies, I decided to write an “Honest Terms of Service”. The list my vent generated was long, so here are just a few of the terms.
Honest Terms of Service
These terms of service are intentionally written in legalese such that, even if you understand them, they will not help you realize their full implications. They are long and blandly written and, hey, don’t you have better things to do? If you are breathing, you have consented to these terms of service. Let’s be honest, you didn’t even make it this far.
– We will begin building an extremely detailed profile of you whether or not you use our service.
– If you become a user, we will aggregate all of your new user data with what we have already collected.
– We will sell all of your data and derived insights to anyone who will pay for it, regardless of their intent or associations. We will continue to sell your information to them regardless of how they use it.
– We will offer an app that lets you have “private” conversations with others and give you the impression those conversations will never be shared. We will mine and sell that information.
– We will listen to your audio conversations. We will transcribe this information. We will use or sell what we learn from it.
– We will offer you the convenience of using our ID to log in to other services. We will keep a detailed record of every service you use and when. We will aggregate that with everything else we know about you and sell it.
– We will capture your images from photos you post, whether restricted or public, and sell them to facial recognition companies so they can identify you. They will sell the derived information.
– We will track your location and interests and use that to determine whom you know. We’ll also pull that from sources such as your friend and contact lists. We’ll be watching them too.
– We will publish non-binding statements about how we value you as a customer and respect your privacy. While it is true we value your potential as a revenue source, we value profit over your privacy.
Though these terms were inspired by actual events, my point is to highlight the extent of this issue, not to name and shame any organization.
Clear terms of service are not enough
Even if we could envision a world where terms of service would be read by everyone and truly understood, could they ever truly be effective? Would they ever result in the type of digital privacy that people will eventually demand? I do not believe so.
A model for trust
A number of digital trust frameworks are emerging to address this, such as DIACC’s Pan-Canadian Trust Framework (PCTF) and the Citizen First Joint Council’s PCTF Public Sector Profile. These frameworks help set a bar for digital interaction that should enable us to better live up to our digital potential. Conformance to their guidance should at the very least enable us to better gauge which organizations we can trust, to what standard of identity and privacy they operate and, to some extent, how they value individuals’ privacy.
In fact, it was my review of the latest PCTF “Notice and Consent” criteria that led me to think about these “honest terms of service”. To achieve compliance, the Framework demands clear communication from providers to users about things such as the specific data they intend to use or share and how it will be used or shared (“Notice”). It also requires providers to obtain individuals’ explicit permission to use the data, along with their acknowledgement of those intents, parameters, and constraints, before the data can be used (“Consent”). There are some exceptions, but it’s a great beginning.
You can learn more by reviewing these emerging frameworks yourself. These groups are always looking for feedback and new members. Check them out, and help us move toward a more effective, productive, privacy-respecting digital economy.
About the author
George Watt is a Partner at Becker-Carroll, a Converge Company, and is responsible for Strategy and Lean Innovation. A transformative leader, George has spearheaded initiatives that have enabled businesses and global enterprises to address complex technology problems, deliver new business benefits, and drive millions of dollars in savings and productivity gains. He has delivered innovations of his own such as a knowledge base for a neural network-based predictive performance management solution, one of the earliest private clouds (2005), and a lightweight event management agent.
As VP of Strategy for a multibillion-dollar technology company, he was responsible for global scientific research, worldwide innovation initiatives, and the design and operation of an innovative startup accelerator program. He has held many national and global leadership positions and has led global teams spanning North America, Europe, Asia, and Australia. George is co-author of “The Innovative CIO” and “Lean Entrepreneurship” and tweets as @GeorgeDWatt.