
Who’s afraid of GDPR?

Privacy barely featured in law until the latter half of the twentieth century.  Ever since the Roman census in the days of Herod, the idea that the powers that be were entitled to as much data as they wanted has been ingrained in the general population.  If asked, people would provide whatever information was requested in order to prove who they were, where they lived or what their income was.

This largesse extended to large corporations too.  Hardly anyone questioned the long forms that came with buying or hiring anything, whether it was to purchase an insurance policy, get a bank loan or simply hire a car.  It was as if those selling things or providing services were doing you a favour, so you had to go along with whatever conditions those businesses decided were right and proper.


The era of personal privacy

Then the concept of personal privacy came along.  It is not one that would have been immediately apparent to someone who lived between the wars.  Back then, privacy was about keeping your secrets safe from neighbours or work colleagues.  The assumption was that public bodies and businesses had a right to know huge amounts about you if you had consented to deal with them.

Perhaps the reason no one really worried about it before was that keeping tabs on the minutiae of every person’s life and opinions took far more effort than it was worth.  Knowing people’s movements, for example, meant paying someone to follow them around, as was sometimes done for big insurance claims in order to photograph claimants who said they were almost crippled by an accident out running or bouncing on a trampoline.  This was a very expensive approach, and so it was only used where the company was fairly sure of a big payback.

Now, technology has changed that equation.  Gathering information about people has become far easier over the last few decades, and powerful computers make it possible to search for information on a specific individual – the traditional needle in a haystack – or to crunch the numbers and derive behavioural insights from the mass of data that is available.  Combine that with the fact that so many people are putting the information out there themselves via social media and the job becomes easier still: the information does not have to be gathered, it just has to be filtered out of the mass of information that already exists.

Cameras on streets can change traffic lights at different rates to optimise the flow of cars, based on the volumes they can see approaching a junction from different directions.  Or they can identify an individual driving too fast and trigger an automatic fine with photographic evidence to back it up.  All without human intervention.  But it also means the cameras are gathering information on us, and our whereabouts can be identified without us having any say in the matter.  If someone gains access to this data, they can plot our movements throughout the day.

It is even worse with the smartphones we all carry.  From standard phone signals alone, triangulation between masts means we can be positioned roughly.  But our constant use of Wi-Fi in public places and businesses such as cafes, restaurants and train stations means that our actual position can be found quite easily.  We are constantly giving this information out to machines which, like elephants, never forget.


What’s the problem?

If people are so willingly giving away so much information about their own lives and the choices they are making, why is there so much drama about data privacy?  Why do we end up with ludicrous situations such as the one my wife encountered recently when she tried to open a bank account for a charity: she was compelled to fill out completely new forms and provide ID again, even though she already had an account with the bank.  The bank claimed that under GDPR it could not share her own information with herself.

There are many incidents like this.  For example, the Irish Office of Public Works (a body similar to the National Trust) has withdrawn all visitors’ books from its historic sites in case anyone photographs the names and addresses some visitors write in them.  The fact that the visitors wrote their addresses and comments deliberately, in order to be seen, doesn’t seem to matter.

The point is that when people decide to put information into the public domain, it is their decision.  If they send information to a company, they are knowingly supplying that information.  They do not expect it to be abused by being leaked or sold to another company.  But they are engaging with the company and will expect quite a few people within it to have access to the information.  So why are so many companies and institutions so worked up about the extremely unlikely possibility that they will be held responsible for simply using data the customer has voluntarily submitted?

Failure costs

The reason is that the scale of the fines that can be imposed for GDPR breaches – up to 4% of global turnover – has firms very afraid indeed.  It would not be an overstatement to call it paranoia.  As a result, people are making life much more difficult for themselves.  Every piece of data that might fall into the ‘personal’ category is treated as if it were radioactive.  And yet all this data is willingly submitted by customers precisely because they want the organisation to provide services tailored to them as individuals – something that is impossible if every piece of information held about the individual is to be treated as private, and therefore unusable.

The financial services sector is, by its very nature, extremely prone to this.  Banks, insurers and other financial institutions can be paralysed by the sheer volume of personal financial information they already hold and the fear that they could be accused of doing things with it that they shouldn’t.  Internal controls can become excessive – I was recently told of an insurance firm where marketing staff were prevented from seeing lists of incoming enquiries they were supposed to respond to, because the lists contained personal email addresses.  Yet the customers submitting the information did not imagine they were submitting it to just one person in the organisation, the employees had a valid business reason for seeing the data, and all employees should in any case be bound to keep information confidential by virtue of their employment contracts.


Customers are smart

Some of this is also patronising to customers.  They do not expect to get highly personalised services without handing over a lot of information.  They are not so gullible as to believe that a company can simultaneously know enough about them to recommend the right products and yet hold very little personal data about them.

This is the true irony of the current situation: people are said to be demanding huge levels of data privacy while at the same time demanding highly personalised services.

The biggest fines to date have not gone to financial institutions; they have gone to the airline British Airways, the hotel group Marriott and the search giant Google.  Although all three fines were for data protection failures, British Airways’ fine (£183M) and Marriott’s fine (£99M) were for cyber security breaches in which the data was hacked.  This was down to their failure to protect the data from external attack and, while it calls for far more rigorous approaches to the technical security of their data, it had nothing to do with the amount of data they were gathering nor with who was carrying out the processing within the organisation.

All companies need to regard their data storage the way they would regard a bank vault – something to be protected at all costs, because its contents are extremely valuable and therefore attractive to hackers and thieves.  These failures would have been huge problems for those companies even if GDPR had not existed, although the fines would probably have been lower.

Only Google (£44M) was fined for a purely ‘GDPR’ issue: the failure to provide users with transparent and understandable information on its data-use policies.  That is undoubtedly something that would not have happened before GDPR, although it is questionable whether the scale of the fine would have been the same for any other firm.


The enemy without

The key point is that data protection is primarily common sense.  If the customer is seeking a product or service, then clearly they must give the company some personal data.  In the case of life and pension companies, this is likely to be quite a lot of highly sensitive data, given the nature of their products.  It is not wrong to hold it, but it must be clearly secured against unauthorised use.

This does not, however, mean that one needs to be paranoid within the organisation – all employees are bound to keep the company’s information private by virtue of their contracts.  Above all, it means focusing on keeping the data secure from outsiders and ensuring that the correct processes and policies are in place internally to prevent data leaking from the inside.

Cyber security is where the big focus should be.  The greatest pressure should be on those who manage the IT infrastructure, to prevent outsiders from gaining unauthorised access.  Secondly, staff must be fully data-aware so they can co-operate with this.  That means regular training and testing of employees’ ability to recognise threats, since one of the more common routes hackers take is to turn a member of staff into an innocent accomplice, phishing them or using some other email scam to implant trojan software on the system, which then helps the external attacker gain access.

While huge care is needed, and no one would say that data protection shouldn’t be a top priority, the paranoia that is affecting many companies should stop.  The client has willingly given the data, and so the key is to use it as the client would expect, i.e. to provide the products or services that the client is looking for.


Use the asset

Data protection should be seen as a positive by life and pension companies.  The way you protect your customers’ data can be a key part of your strategy, given that the basis on which people entrust their money or protection to a life and pension company is trust itself: confidence that the company will deliver when the unforeseen happens.

Instead of panicking over the possibility of an internal process going wrong, a company is better off focusing on ensuring that the data is protected from outside attack and that a full internal audit trail is in place to spot any rogue employees, who would be in breach of their contracts if they leaked information they were not entitled to share.  Once these protections are in place and constantly monitored, the data should be used as the asset it is, informing the company about customer behaviour and helping it hone its products and services.  Use the asset, don’t fear it.

About the author


Denise Garth is Chief Strategy Officer responsible for leading marketing, industry relations and innovation in support of Majesco’s client-centric strategy, working closely with Majesco customers, partners and the industry.