The report focuses on the promise of information technology for health treatment and research, noting the considerable investment in the area by industry and the federal government (the American Recovery and Reinvestment Act of 2009 alone provides nearly $36 billion for health information technology). The real thrust of the report, however, is the lack of uptake – “Despite this great promise, the impact of IT on healthcare over the past decade has so far been modest”– and recommendations for what to do about it.
To achieve the promise of health IT, the report recommends moving from a system oriented around “records” to one oriented around individual “data elements”:
“…the best way to manage and store data for advanced data-analytical techniques is to break data down into the smallest individual pieces that make sense to exchange or aggregate. These individual pieces are called “tagged data elements,” because each unit of data is accompanied by a mandatory “metadata tag” that describes the attributes, provenance, and required security protections of the data.”
Tagged data elements could then be shared across institutions through a “universal exchange language,” for which the federal government would set standards and which industry would develop and deploy.
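To make the concept concrete, the following is a minimal sketch of what a tagged data element might look like. This is purely illustrative: the PCAST report describes tagged data elements conceptually but does not prescribe a concrete format, and every class, field, and value below is an assumption, not the report's specification.

```python
from dataclasses import dataclass

@dataclass
class MetadataTag:
    """Mandatory tag accompanying each data element, covering the three
    items the report names: attributes, provenance, and required
    security protections. Field contents are illustrative."""
    attributes: dict   # e.g., data type and units
    provenance: str    # where and when the element originated
    security: list     # required protections for this element

@dataclass
class TaggedDataElement:
    """Smallest individually exchangeable unit of health data, paired
    with its mandatory metadata tag."""
    value: object
    tag: MetadataTag

# Example: a single blood-pressure reading as one tagged element.
element = TaggedDataElement(
    value="120/80",
    tag=MetadataTag(
        attributes={"type": "blood_pressure", "units": "mmHg"},
        provenance="Clinic A, 2010-11-15",
        security=["encrypt-in-transit", "audit-access"],
    ),
)
```

The point of the structure is that the tag travels with each individual element, so attributes, provenance, and protections remain available wherever the element is aggregated or exchanged.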
These recommendations alone make the report significant: if implemented, they would change the structure of IT in the largest industrial sector in the United States.
But the report goes on to address privacy explicitly and in apparently contradictory ways.
On the one hand, the report stresses the need to enact “strong, persistent privacy safeguards.” According to the report, consent must be part of those safeguards: “An individual’s right to have some meaningful choice in how their information is shared is one important component of a comprehensive set of protections. Where such choices are provided, either in law or by policy, they must be persistently honored.”
The report repeatedly lauds the value of data tagging as a way of recording “patient privacy preferences” that could then control subsequent use of the tagged data elements. “An exchange language based on tagged data elements allows for privacy rules and policies to be more effectively implemented; it also allows finer-grained individual privacy preferences to be more persistently honored.”
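The mechanism the report envisions can be sketched as a simple gate: before an element is used, the proposed use is checked against the preferences recorded in its tag. Again, this is a hypothetical illustration; the function name, the preference vocabulary, and the deny-by-default choice are all assumptions, not anything the report specifies.

```python
def use_permitted(tag_preferences, proposed_use):
    """Check a proposed use against the patient preferences recorded in
    an element's tag, denying any use not explicitly permitted.
    Illustrative only -- the preference vocabulary is assumed."""
    return tag_preferences.get(proposed_use, False)

# Preferences one patient might record for one data element.
preferences = {"treatment": True, "research": False}

use_permitted(preferences, "treatment")  # permitted by this patient's tag
use_permitted(preferences, "research")   # denied: the preference travels with the data
use_permitted(preferences, "marketing")  # no preference recorded, so denied by default
```

Because the check consults the tag itself rather than an institution's local policy, the preference would be honored wherever the element travels, which is what the report means by preferences being “persistently honored.”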
Then, on the following page, the report recommends moving beyond the HIPAA Privacy Rule to free health research from the bureaucratic burdens of patient choice. Citing a 2009 report by the Institute of Medicine that recommended minimizing the role of individual consent in the use of protected health information for health research, the PCAST report urges that the Privacy Rule “should be reformulated so that [it ensures] both patient privacy and patient benefit from medical research, in a world where medical data are increasingly in electronic form and where there is a growing need for real-time or near-real-time aggregated data to improve healthcare.”
The irony could hardly be more stark. The Institute of Medicine report that PCAST cites concluded that “a universal requirement for informed consent would impede important health research and lead to biased, ungeneralizable results, to the detriment of society.” It also found that “[t]he Privacy Rule, as currently defined and operationalized in practice, does not provide effective privacy safeguards for information-based research because of an over-reliance on informed consent, rather than comprehensive privacy protections.”
In short, within two pages, the PCAST report recommends both amending HIPAA to permit greater access to protected health information to facilitate research, and implementing data-tagging that would allow each individual to impose his or her own unique conditions on the terms under which access might be allowed.
If the PCAST report is implemented with the emphasis on its recommendation that the Privacy Rule rely less on consent, in order to facilitate effective, affordable and timely health treatment and research, the result would be consistent with a recent series of government reports, including the Federal Trade Commission’s privacy report, “Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers,” and the Department of Commerce’s Green Paper, “Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework.” The Institute of Medicine also clearly recognizes the limits of choice as a tool for protecting privacy. As its report states, “consent (authorization) itself cannot achieve the separate aim of privacy protection” because “obligations to safeguard privacy, such as security, transparency, and accountability, are independent of patient consent. In fact, preventing the secondary use of personal data is the only privacy obligation that consent can potentially address.”
If, on the other hand, the PCAST report is interpreted to emphasize the recommendation that granular privacy preferences be encoded in data tags that would govern the use of individual data elements indefinitely, it would run contrary to those reports and significantly change the way U.S. policy approaches privacy protection.
Moreover, the experience with notice and choice to date suggests that basing privacy protection on “finer-grained individual privacy preferences” that would be “more persistently honored” would be unworkable. Given the extraordinary proliferation of health-related information, it seems unrealistic to expect that individuals, who already overwhelmingly ignore privacy notices and choice opportunities, will have the time, expertise or interest to express even more detailed preferences.
The report is now in the hands of the National Coordinator, who has the responsibility for implementing it.