
Ethical Big Data Practice

Editor’s Note: The following article is written by Matthew Johnston, managing director for South Asia at Dell Software. He discusses the need for organizations to build an ethical big data practice.

As big data becomes further entrenched as a mainstream challenge for organizations both large and small, much of the industry discussion continues to revolve around two critical business needs: the need to find more effective ways to manage the explosion of data and data types, and the need to better capitalize on the opportunity this proliferation affords to make smarter, more insightful decisions.

Amid this justifiable fixation, however, another critical need is often overlooked: the need to build an ethical big data practice, one with proper sensitivity to the privacy concerns of customers.

The ethics of big data is a complicated subject, with much of the complexity stemming from the ambiguities inherent in the concept of privacy itself. What exactly is privacy, and more important, what are an individual’s rights to privacy? The boundaries of privacy differ across cultures, and although it’s generally understood that individuals are entitled to some level of privacy, the question of whose responsibility it is to protect that privacy has no single, clearly definable answer.

The issue becomes even murkier when dealing with the nuances of information exchanged between customer and business. In some cases — namely, with data disclosed to service providers such as doctors and lawyers — the responsibility has always been on the receiver to protect the privacy of that information. In the case of customer-business relationships that are more transactional, however, such as with a retailer, the protection of private information has historically been seen as the responsibility of the discloser. If people don’t want information exposed to a given business, they simply shouldn’t provide those details in the course of their dealings with that business.

There was a time when that concept made sense. In the big data era, however, that’s no longer the case. Digital information is fluid, its exchange is simple, and its distribution is instantaneous, global, and increasingly essential. Anyone in marketing will attest to the importance of understanding individual consumer identity and patterns, making the need to obtain detailed information about a given customer’s background and behaviour critical to the survival of many modern businesses. In other words, businesses aren’t just receiving information anymore; they’re using information, with a purpose.

The transformation inherent in businesses’ need to benefit from personal information means the responsibility for ensuring privacy has permanently shifted from the discloser to the user, and organizations must therefore take immediate steps to ensure their burgeoning big data programs are implemented with the ethics of privacy in mind. Though the blueprint for doing so is far from complete, here are four steps that can help your business move in the right direction.

1. Understand the risk factors
Many big data privacy breaches are unintentional and occur without the analysts involved even realizing it. That’s why it’s critical for everyone involved in a big data initiative to understand the risks associated with the handling of customer information.

Perhaps the biggest risk lies in merging purchased data with other pattern data to infer or detect non-disclosed information, details the customer may well consider private. To keep the practice ethical, target customers solely on the basis of information disclosed within any single transfer of data.
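
To make that risk concrete, here is a minimal, hypothetical sketch in Python. The customer records, the purchased data set, the field names, and the targeting_view helper are all invented for illustration; the point is simply the contrast between a join that infers a non-disclosed attribute and a targeting view restricted to what was disclosed in a single transfer.

```python
# Hypothetical illustration only: all names, fields, and records are invented.

# Information the customer actually disclosed to the business in one transaction.
disclosed = {
    "customer_id": 1001,
    "email": "jane@example.com",
    "purchase": "prenatal vitamins",
}

# A purchased third-party data set, keyed on email address.
purchased = {
    "jane@example.com": {"age_band": "25-34", "household_income": "high"},
}

# Risky practice: merging the two sources lets an analyst infer attributes
# the customer never disclosed and may consider private.
merged = {**disclosed, **purchased.get(disclosed["email"], {})}
merged["inferred_life_event"] = "expecting a child"  # detected, not disclosed

# Practice suggested above: target only on fields disclosed in this single transfer.
ALLOWED_FIELDS = {"customer_id", "purchase"}

def targeting_view(record):
    """Return only the fields the customer disclosed in this transaction."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}

print(targeting_view(disclosed))  # {'customer_id': 1001, 'purchase': 'prenatal vitamins'}
```

The merge step is technically trivial, which is precisely why analysts can cross the line without noticing.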

2. Educate users
Implementing policies that prevent employees from detecting non-disclosed information is only part of the equation. Employees must still take personal responsibility for what they do with customer data, even data obtained in an ethical manner. It’s imperative for a company to educate its staff on what level of targeting is acceptable, and, conversely, where the gray areas are that can potentially be harmful to both the customer and the company’s brand.

3. Design and use tools with privacy in mind
This step applies to both the vendors creating analytical tools and the analysts using them. Today’s big data analytics technologies are undeniably powerful and are constantly improving, enabling analysts to reach further than they ever could under the do-not-disclose privacy paradigm. As the saying goes, “with great power comes great responsibility”. Without careful consideration by both the designer and user of a given analytical system, overstepping acceptable boundaries can and will happen.
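
As one hedged illustration of what designing with privacy in mind can look like at the tool level, the sketch below (reusing the invented field names from the earlier example) builds the boundary into the analytics helper itself, so that segmenting on a non-disclosed attribute fails loudly rather than silently succeeding.

```python
# Hypothetical sketch of a tool-level safeguard; the schema, field names,
# and policy are assumptions for illustration only.

DISCLOSED_FIELDS = {"customer_id", "purchase", "store_location"}

class PrivacyPolicyError(Exception):
    """Raised when an analysis would reach beyond disclosed data."""

def build_segment(records, group_by):
    """Count customers per value of a field, but only if customers disclosed it."""
    if group_by not in DISCLOSED_FIELDS:
        raise PrivacyPolicyError(
            f"'{group_by}' was not disclosed by customers; refusing to segment on it."
        )
    counts = {}
    for rec in records:
        key = rec.get(group_by)
        counts[key] = counts.get(key, 0) + 1
    return counts

records = [{"customer_id": 1, "purchase": "vitamins", "store_location": "SG"}]
print(build_segment(records, "store_location"))  # {'SG': 1}

# build_segment(records, "inferred_life_event") would raise PrivacyPolicyError,
# making the boundary a property of the tool rather than of individual judgment.
```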

4. Lead from the top down
Employees take their cues from company leaders, and big data is no exception. Top-level executives and business leaders must make clear to the company’s analysts, marketers, and line-of-business managers that it is not acceptable to achieve company objectives at the expense of customer privacy.

The need to derive business value from big data is paramount. But in a world where technology regularly outpaces social norms, and customers have not yet had time to establish their comfort levels with the use of their personal and usage-pattern data, the responsibility falls on business leaders to demand that employees follow an ethical path.
