Customer data governance policies: Stop stalking, start selling
Customer data governance policies differ from traditional data governance in several ways. Finding the line between building trust with customers and stalking them is a start.
In the 1964 case Jacobellis v. Ohio, U.S. Supreme Court Justice Potter Stewart famously said that obscenity is hard to define, but "I know it when I see it." In much the same way, it might be tough to draw the line between good use and abuse of customer data in efforts to drive sales, but we know abuse when we see it, as evidenced by these all-too-real events.
Wells Fargo. Last year, investigators found that employees had opened 2 million unauthorized credit card and deposit accounts in customers' names to meet aggressive sales quotas. The fallout included $185 million in fines, $2.6 million in refunds and thousands of employee firings. By March 2017, the bank's credit card applications were down 55% and new checking account openings had declined 43%, compared to March 2016.
Target. The retailer divined, likely with predictive analytics, that a teenage customer was pregnant, which her father found out only after Target sent his daughter coupons for baby clothes and cribs, a store employee told The New York Times.
Bank of America. A feminist freelance writer received a credit card offer from Bank of America via direct mail with her middle name changed to a sexist pejorative. The offer went viral within minutes after the writer's mother discovered the letter and sent smartphone pictures of it to her daughter, who tweeted them.
If that's not enough, how about this one? A friend or relative uses your computer to search Google for information on a product. Soon after, ads for that product appear on media sites across all your devices, stalking you for weeks and leaving you with a very negative impression of it. When the time comes to buy, you might even purchase a competitor's product out of spite.
With all the marketing automation tools, analytics software and artificial intelligence (AI) systems available today, companies can personalize pitches more deeply than ever before. Furthermore, the capacity to collect and maintain big data stores empowers these tools to give marketing leaders at both consumer and B2B companies a more detailed window into the minds of their customers.
Sometimes, however, it can get too personal.
New approach drives governance
Staying on the right side of customer data use -- in the legal sense, the ethical sense and the not-scaring-away-customers sense -- takes new approaches to data governance policies. At stake are lost sales, internet notoriety for the wrong reasons and even regulatory penalties for unlawfully using or exposing data on finances, health or students' grades.
Daragh O Brien, founder and lead consultant of data management consultancy Castlebridge Associates, compared the line between legitimate and questionable uses of customer data to "the difference between a concerned friend and a creepy stalker." Exactly where to draw that line isn't cut-and-dried, O Brien said -- it depends on the specific data involved and the context in which it's being used.
"In a lot of cases, it comes down to basic ethics and a good feel," O Brien noted. If a company is just using customer data as a means to an end that serves its own purposes, customers might "get upset about the things you've done and start saying nasty things about you on social media," he warned. "You need to see the people you're serving as an end themselves and do things so you can serve them better."
Lesser sins against data accuracy can also turn customers off, pointed out Dan Power, managing director of data governance at State Street Corp.'s Global Markets unit. It doesn't take much over-automation to make customers feel as if they're just a number and you don't really care about their business. Failing to update an address when a customer moves, for example, or, for corporate accounts, not updating records when a customer has been renamed in a merger can contribute to that sentiment.
The good news is that customer data governance is likely already happening on an ad hoc basis in your organization, Power continued. School-of-hard-knocks stories about sales and service people who stepped over the line and lost customers are probably circulating in those departments, and department leaders may even have put some informal rules in place. The architects of those rules are a natural fit for a customer data governance team, which should also include representatives from both IT and the business side of the organization. Together, they can design and implement rules that make sense for all.
Reining in powerful tools
So where, exactly, is the line between helping and hurting your business when it comes to data mining? "The 'creepy zone' is very grey right now," said Jim Tyo, chief data officer at Nationwide Mutual Insurance Co. For example, getting product recommendations from Amazon based on your buying history can be useful when shopping online, Tyo said during his presentation at Enterprise Data World 2017 in Atlanta. "But if I go to Target and the checkout person is talking to me about that, it would be different. And we're not far away from that."
In an interview afterward, Tyo said Nationwide governs customer data containing personally identifiable information differently than other types of data that aren't as sensitive. "You have to," he said. Otherwise, "you can become the poster child for what not to do."
On the back end, Nationwide manages security and user access "in a much more comprehensive way" as part of its customer data governance policies, Tyo explained. That includes steps like encrypting data, controlling its use in development and test environments, and monitoring internal data sharing to make sure information is seen only by end users with a legitimate need to access it.
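Tyo didn't describe Nationwide's systems in technical detail, but the general technique of pseudonymizing sensitive fields outside production and gating raw access by role can be sketched in a few lines. The example below is purely illustrative: the field names, roles and environment labels are assumptions for the sake of the sketch, not anything Nationwide disclosed.

```python
import hashlib

# Hypothetical field classification -- illustrative only.
PII_FIELDS = {"name", "email", "policy_number", "ssn"}

# Hypothetical roles with a legitimate need to see raw PII.
ROLES_WITH_PII_ACCESS = {"claims_adjuster", "fraud_investigator"}


def pseudonymize(value: str, salt: str = "env-specific-salt") -> str:
    """Replace a sensitive value with a stable, irreversible token."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]


def prepare_record(record: dict, role: str, environment: str) -> dict:
    """Return a copy of the record that is safe for the given role and environment.

    PII fields are pseudonymized unless the caller is in production
    *and* holds a role with a need to see the raw values.
    """
    allow_raw_pii = environment == "production" and role in ROLES_WITH_PII_ACCESS
    safe = {}
    for field, value in record.items():
        if field in PII_FIELDS and not allow_raw_pii:
            safe[field] = pseudonymize(str(value))
        else:
            safe[field] = value
    return safe


if __name__ == "__main__":
    customer = {"name": "Jane Doe", "email": "jane@example.com",
                "policy_number": "PN-1234", "premium": 1180.50}

    # A developer working in a test environment never sees raw PII.
    print(prepare_record(customer, role="developer", environment="test"))

    # A claims adjuster in production sees the full record.
    print(prepare_record(customer, role="claims_adjuster", environment="production"))
```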
In addition, the Columbus, Ohio, insurer is "very cautious" about using certain data when interacting with customers, Tyo said. He pointed to telematics data collected from the vehicles of auto insurance customers as an example. Sensors installed in cars to collect operational data can transmit a broad range of information on driving behavior, according to Tyo -- not only on acceleration and braking, but also on activities such as turning up the radio volume or changing other settings on the control panel.
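One way to operationalize that caution is to whitelist which telematics fields are allowed to flow into analytics at all. The sketch below is hypothetical; the field names simply mirror the kinds of signals Tyo mentioned and do not describe Nationwide's actual pipeline.

```python
# Hypothetical telematics event; field names are illustrative only.
raw_event = {
    "vehicle_id": "V-9087",
    "timestamp": "2017-05-12T08:14:33Z",
    "hard_braking": True,
    "acceleration_g": 0.42,
    "speed_mph": 47,
    "radio_volume": 18,        # behavioral detail a policy might exclude
    "hvac_setting": "auto",    # behavioral detail a policy might exclude
}

# Governance policy: only driving-safety signals are approved for analytics.
ANALYTICS_ALLOWED_FIELDS = {"vehicle_id", "timestamp", "hard_braking",
                            "acceleration_g", "speed_mph"}


def filter_for_analytics(event: dict) -> dict:
    """Drop any telematics field not explicitly approved for analytics use."""
    return {k: v for k, v in event.items() if k in ANALYTICS_ALLOWED_FIELDS}


print(filter_for_analytics(raw_event))
```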
Analyzing that data can provide insights on ways to fine-tune insurance services and programs, he said, but acting on it with individual customers -- after a driving incident, for example -- might leave a bad taste in people's mouths. "That's one thing where we may need to just see what happens elsewhere," Tyo said. "In an organization like Nationwide, I don't think trial and error is the right approach on this."
Tara Kelly is president and CEO of Splice Software, a Calgary, Alberta, company that uses AI to automate customer text-messaging interactions for insurance, healthcare and financial companies. She said auto telematics is turning into a big data experiment gone bad: customers often feel they don't get enough in return for giving insurers so much data about their driving. It becomes less of a value-add and more of a situation out of George Orwell's dystopian novel, 1984.
"It's an example of consumer backlash," Kelly said. "In the end, tons of people got more penalties than discounts, and all of a sudden you brought Big Brother into your daily life to judge everything you do, to sell your data and above all to change your pricing."
Craig Stedman, senior executive editor of TechTarget's information management websites, contributed to this article.