
HEALTH DATA REGULATION IN INDIA-PART II: OBLIGATIONS AND COMPLIANCES FOR HEALTH DATA FIDUCIARIES UNDER HDMP AND PDP


One measure (simply to drive home the point) to understand the implications of obligations under data (including health data) laws around the world is to understand the cost of compliance and non-compliance. More importantly, the ‘cost’ of compliance and of non-compliance must necessarily be understood as two different things, with the latter emanating from fines and penalties, while the former constitutes activities undertaken to ensure non-compliance does not happen. This piece seeks to lay out, firstly, why health data fiduciaries (HDFs) need to comply with the HDMP and PDP; secondly, what the obligations under the proposed law (PDP) and HDMP are; and thirdly, some of the actions that could be resorted to for mitigating the compliance burden that health companies/startups face.

In trying to understand the cost of compliance, comparing the anticipated rollout of the PDP and the HDMP to the rollout of the GDPR is a good starting point. Even pre-GDPR, between 2011 and 2017, companies’ expenditure on both compliance and the consequences of non-compliance rose steadily. As of 2017, the average annual cost for organizations that experienced non-compliance problems was USD 14.82 million, a 45% increase from 2011. The cost of compliance varied significantly by industry sector, ranging from USD 7.7 million for media to more than USD 30.9 million for financial services. The percentage net increase in total compliance cost between 2011 and 2017 also varied by industry, with healthcare organizations experiencing the highest growth in cost at 106%.

Civil society in India has advocated for a graded approach to scrutinizing digital businesses so that startups are not overburdened with compliance costs. In a November 2020 issue brief published on the ORF website, a recommendation was made to base such scrutiny on thresholds, with businesses that cross certain thresholds required to get their IT systems certified against applicable standards, the certification remaining voluntary for others. While one can understand the force behind this sentiment, especially when data regulation (including, inter alia, the costs associated with it) could impact startup innovation, this may not be possible for health-tech startups, especially those dealing with certain kinds of sensitive personal health data. Health data forms a part of SPD, which requires a higher form of protection, with a privacy-by-design focus advocated by the NDHM in its HDMP. The proposed PDP law addresses this in various ways, whether through its emphasis on the nexus between SPD and significant harm, or in the way SPD is treated, requiring a higher standard of consent from a DP before such data can first be acquired and then processed by a DF.

There is also evidence to suggest that smaller organisations have a higher per capita cost of compliance. When adjusted by headcount (size), compliance costs are highest for organizations with fewer than 1,000 employees and lowest for organizations with 75,000 or more employees. This result may be explained in part by economies of scale: larger companies have access to leading data protection technologies and highly skilled personnel with expertise in data protection laws and regulations, while organizations with fewer than 5,000 employees have to rely on expensive external resources such as consultants and lawyers to meet compliance requirements on a global scale.

While, in the Indian context, there is no standard and reliable information available on the economic impact of activities undertaken by companies to comply with our privacy / data protection laws, information accessed from privacy solution provider datagrail.io breaks down, very lucidly, the factors that affect the continuous cost of compliance. A practically observed premise is that the cost of compliance cannot be measured in money alone: it must also include the operational expense of human resources and time. The report goes on to argue that measurements of the impact must extend beyond the initial cost of preparation to examine sustained compliance. Companies have spent hundreds of thousands – even millions – of dollars on compliance solutions, yet they continue spending thousands of hours manually managing compliance at the risk of introducing human error. Preparing to comply with the GDPR involves a set of activities such as data inventory and mapping, establishing a workflow (whether automated or manual) for processing subject access requests (SARs), implementing consent management, and updating privacy policies. Sustaining that compliance, however, involves further activities such as continually updating the data map with new fields and business systems, communicating process changes to all employees involved, producing robust compliance logs, and staying current with regulation updates or changes – in addition to processing the SARs that come through. Decision makers report spending virtually the same amount of time working to sustain compliance as they did to prepare: this is the cost of continuous compliance.

In a survey taken to assess the impact of the GDPR on digital companies, a number of figures were reported, but the data on SARs is one of the noteworthy takeaways that illustrates the point being made here. SARs can be understood as declarations of intent by DPs to enforce their rights over their data (including the right to confirmation and access, the right to correction and erasure, the right to data portability, and the right to be forgotten/stop continued disclosure). The PDP in its current form provides DFs a 30-day window to adequately respond to each SAR, which is standard across laws and jurisdictions, though the US is going one step further: the OCR in the Department of Health and Human Services is implementing a shorter 15-day window for such responses. 30 days may ‘seem’ like a lot, but it really isn’t. In a survey by datagrail.io, more than half (58%) of companies were receiving 11+ SARs per month and 28% were receiving 100+ per month (as of 2019); additionally, more than half of companies (58%) had at least 26 employees managing these requests. The survey went on to state that, extrapolating conservatively, thousands of emails or alerts were sent to manage SARs in a year (2018-19), increasing the magnitude of risk with each touch point. SARs are a key requirement under privacy laws such as the GDPR and CCPA, and are seemingly being given the same legislative and regulatory importance under the PDP and HDMP. Much of the cost of compliance can be attributed to having to gather, collate and redact information manually, which is why it would make sense for startups to invest in automated solutions to handle such requests from the start.
On a per-unit (per-SAR) analysis, companies in the UK were found to spend about GBP 10 complying with each SAR, and even this figure rarely covers the true cost of compliance, particularly where the request is complex and collating the information is especially time-consuming.
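The statutory response window discussed above is, in practice, a deadline-tracking problem. The sketch below is a minimal, hypothetical illustration (the class and function names are my own, not from any law or product) of how a startup might track incoming SARs against the PDP's proposed 30-day window:

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Proposed PDP response window; a shorter value (e.g. 15) would model the
# OCR's stricter US timeline mentioned above.
RESPONSE_WINDOW_DAYS = 30

@dataclass
class SubjectAccessRequest:
    """One SAR received from a data principal (illustrative fields only)."""
    request_id: str
    received_on: date
    resolved: bool = False

    @property
    def due_by(self) -> date:
        # Last day on which a response is still within the window.
        return self.received_on + timedelta(days=RESPONSE_WINDOW_DAYS)

    def is_overdue(self, today: date) -> bool:
        return not self.resolved and today > self.due_by

def overdue_requests(requests, today):
    """Return the SARs that have blown past the statutory window."""
    return [r for r in requests if r.is_overdue(today)]
```

Even a sketch this small makes the operational point: at 100+ SARs per month, surfacing the overdue queue automatically is far cheaper than managing it by email.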

The proposed law under the PDP and the HDMP place obligations on DFs very similar to those under the GDPR. The obligations are ever-present from the time an HDF gains access to data on individuals. They cover a wide variety of mandates, from notice requirements to ensuring appropriate means of consent acquisition, to ensuring the principles of purpose limitation and specificity are maintained both when data is acquired and throughout the time it is processed. For the purpose of this article, however, I will focus on the obligations that are borne out of the privacy principles enshrined under both the PDP and HDMP.

I will focus on the privacy principles of ‘privacy by design’ and transparency; while there are other principles, such as maintenance of records and security safeguards, these can be viewed as necessary by-products of the highlighted principles.

Every startup would need to implement a privacy by design policy (PDPOL). This involves not only having a privacy policy (and publishing it on its website), but also a PDPOL (which, as per the HDMP, is likewise to be published on its website). However, the PDPOL is more than just a document. It is a representation of a process and system a company has to actively think about, and spend on, to put in place. It involves the organisational and technical systems designed to anticipate, identify and avoid harm to the DP, and requires adoption of commercially acceptable technology used in the processing of PD. The federated architecture of the NDHM makes it clear that data will be stored at the nearest point of care for an individual, meaning that healthcare institutions and/or companies that provide digital health solutions are to be primarily responsible for storing the data, as processors and fiduciaries. The PDPOL under the HDMP requires that an HDF link its practices to the data protection principles of data minimisation and purpose limitation.

Practically, to comply with this requirement, the first step is bringing all key stakeholders of the organisation together, from the legal, security, IT and engineering teams to the product and marketing teams, to evaluate how they are working with sensitive data. The next priority is exploring the details of how the data should be stored; businesses need to balance data protection and function, i.e. data handling must be safe enough to consistently shield privacy, but not render the information entirely inaccessible. This means there needs to be clarity on how data is stored, how it is backed up, and how it is transferred within and outside the organisation. As part of the PDPOL, the first framework is selective authorisation, where risk is limited by granting access to data only to specific individuals, depending on the function they perform within the company. The second framework is data encryption, which is focussed on obfuscating data to guard against unauthorised access. Encryption takes many forms. For example, data can be encrypted at the points where risk is greatest (during storage, transmission, or access, even where access is permitted), or it can be continuously encrypted across its lifecycle; the latter offers a higher level of security, but its main downside is longer processing to convert and extract information. The third framework is masked storage, which, as the name implies, is about changing the appearance of data: generally replacing attributes such as numerals while retaining format and length. In doing so, it masks personally identifying elements but preserves data value. Again, decisions about how and when to apply masking can be made in line with whether companies want to protect certain areas or all data processes.
The fourth and final framework suggested is tokenised insights, which, much like masking, means swapping sensitive data elements for non-sensitive equivalents, or tokens. The key difference is that full use of these tokens is exclusive to a secure tokenisation system: tokens cannot be mapped back to data unless companies or approved individuals have access to the original tokenisation tool. Broadly speaking, this is one of the more robust methods, and it comes with the additional benefit of flexibility, as data fields can be turned into tokens and back.
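The masking and tokenisation frameworks described above can be pictured in a few lines of code. This is a deliberately simplified sketch, with invented names and an in-memory vault; a production tokenisation system would use a hardened, access-controlled store:

```python
import secrets

def mask(value: str, keep_last: int = 4, mask_char: str = "X") -> str:
    """Masked storage: hide identifying characters while retaining
    the original format and length (e.g. a phone or ID number)."""
    if len(value) <= keep_last:
        return mask_char * len(value)
    return mask_char * (len(value) - keep_last) + value[-keep_last:]

class TokenVault:
    """Tokenisation: swap sensitive values for opaque tokens. Only a
    holder of the vault can map a token back to the original value."""
    def __init__(self):
        self._forward = {}   # value -> token
        self._reverse = {}   # token -> value

    def tokenise(self, value: str) -> str:
        if value not in self._forward:
            token = "tok_" + secrets.token_hex(8)
            self._forward[value] = token
            self._reverse[token] = value
        return self._forward[value]

    def detokenise(self, token: str) -> str:
        # Raises KeyError for anyone without access to this vault,
        # which is precisely the security property tokenisation relies on.
        return self._reverse[token]
```

The contrast between the two is the point made in the text: masking is one-way and preserves only format, while tokenisation is reversible, but exclusively through the tokenisation system itself.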

The privacy principle of transparency under the HDMP will require the HDF to enable a DP to give, withdraw, review and manage their consent through an accessible, transparent and interoperable platform. Reading the transparency obligation in the PDP along with the HDMP requires the HDF to notify the DP, from time to time, of the important operations in the processing of any PD related to the DP, and such information needs to be provided in an intelligible form, using clear and plain language. Conformity in terms of documentation would mean having a comprehensive documented information security programme and information security policy containing managerial, technical, operational and physical security control measures. In other countries with developed privacy laws, the privacy principle of transparency is closely associated with interoperability. As per the NDHM, interoperability is a key feature of the digital health architecture, with the NHA pushing for adoption of the FHIR interoperability standards, so transparency will (to an extent) be built into the interoperability framework. However, there is a limitation built into the FHIR standards. FHIR is an interoperable standard built on ‘Resources’, which can be understood as paper “forms” reflecting the different types of clinical and administrative information that can be captured and shared. Apps and digital representations of health services/products, however, may be built on ‘custom’ resources that are integral to the service/product offered; these are considered non-conforming with the FHIR standard, and such custom resources would have to adopt the transparency principle within the internal code of the systems that support them.
In the US, with interoperability requirements being imposed by the healthcare administration there, digital care providers are facing issues and require clarifications regarding the implications of those requirements. Some of these relate to the need for patient data to be secure, whether in open application programming interfaces or patient-facing apps, and to the view that certain information, such as competitive prices negotiated between health plans and providers, should remain proprietary. Currently, no FHIR implementation guide exists to standardize the method of requesting and exchanging cost transparency information at the point of decision making.

To enable the adoption of the privacy principles above, the HDMP itself suggests solutions (as part of the obligations to be carried out by HDFs) that assist in ensuring conformity with the regulations. One of these is the maintenance of records, which includes maintaining details of ecosystem partners, purposes of processing, descriptions of the categories of DPs, descriptions of the categories of PD and SPD, the categories of recipients to whom the PD/SPD is disclosed or transferred (including data processors), and the geographies of recipients. A second conformity method is conducting Data Protection Impact Assessments (DPIAs). HDFs are to carry out DPIAs before they undertake any processing involving new technologies, or any other processing which carries a risk of significant harm to data principals. Significant harm has a nexus with health data generally, and therefore, while the required implementation is not yet clear, HDFs would need to carry out routine DPIAs. DPIAs are to contain a detailed description of the proposed processing operation, the purpose of processing, the nature of personal data being processed, an assessment of the potential harm, and measures for managing, minimising, mitigating or removing such risk of harm.
Finally, before any activity that can cause a significant data sharing event (like a merger between two healthcare providers or entities, or a significant acquisition/investment event), a Data Audit will have to be carried out by a practicing and independent information security professional. This will cover, inter alia, written procedures and training imparted to all employees of a healthcare organisation, a review of the work done and responsibilities carried out by the entity’s data protection officer (again a mandate of the PDP and HDMP), a review of the privacy policies and notices (for obtaining consent) maintained by the organisation, records of all data subject rights requests and SARs, records and review of the HDF’s dealings with its data processors, DPIAs conducted, data breach events, information asset registers, business contracts, media disposals, and data sharing processes generally. A Data Audit will eventually need to certify the audit findings as either compliant with privacy law or not. This process can take up to six months, with the potential to impact a beneficial commercial event for an HDF, and therefore should not be relied on as a last resort for conformity with the PDP and/or the HDMP.
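The record-keeping categories the HDMP lists lend themselves to a simple structured register. The sketch below is an illustrative data model only (the class and field names are my own, mirroring the categories named above, not any prescribed schema):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcessingRecord:
    """One entry in an HDMP-style record of processing activities.
    Fields mirror the categories the HDMP names: ecosystem partners,
    purposes, DP categories, PD/SPD categories, recipients, geographies."""
    ecosystem_partner: str
    purpose: str
    dp_categories: List[str] = field(default_factory=list)
    data_categories: List[str] = field(default_factory=list)   # PD / SPD
    recipient_categories: List[str] = field(default_factory=list)
    recipient_geographies: List[str] = field(default_factory=list)

def recipients_outside(records: List[ProcessingRecord], home: str) -> List[str]:
    """List partners whose recipients sit outside a given geography,
    a question a Data Audit would routinely ask."""
    return [r.ecosystem_partner for r in records
            if any(g != home for g in r.recipient_geographies)]
```

Keeping such a register machine-readable from day one is one inexpensive way to shorten the Data Audit described above, since most of the auditor's questions become queries rather than document hunts.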

While the obligations and solutions referred to above have been alluded to in the HDMP and PDP, it is expected that the data protection authority, in consonance with the NHA, will provide detailed regulations and guidelines on the steps to be taken by HDFs.

Divyam is a lawyer and founder of consultancy ‘The Narrative Counsel’
