Demanding privacy, and establishing trust, in digital health

February's Wall Street Journal report pulled back the curtain on just how much is at stake when individuals share their personal health information with health and fitness applications.

But the dangers in digital health aren't limited to rogue SDKs; three days after the Facebook news broke, yet another large health system announced the personal information of more than 326,000 patients had been exposed. All this comes as big tech companies like Apple, IBM and Amazon begin to enter the same space, with plans for huge impact.

Companies operating under the highest standards in healthcare are expressly prohibited from monetizing users' data; how will large tech brands adapt their business models to meet that standard?

For the promise of digital health to be realized, companies will need to ensure their patients' data is safe, secure and error-free. Beyond security, healthcare companies operating as providers must also maintain the confidentiality and privacy of that data.

Patients bring a baseline expectation that their privacy will be maintained.

The success of digital health companies will hinge on whether patients feel comfortable sharing the most intimate data they possess, especially when they worry that data could affect their employment.

Federal laws and regulations prescribe privacy and security minimums, as well as the exact rules governing collection, storage and transfer of participant data.

By implementing a privacy and compliance program, you'll be better equipped to find and correct potential vulnerabilities, reduce the risk of fraud, and promote safe, high-quality care.

Original article