This policy is intended to protect patient health information and proprietary data from unauthorized release.
According to several class-action lawsuits against Amazon's virtual assistant, Alexa, these devices frequently capture conversations by accident without being triggered by the "wake word."
While these devices are designed to activate only after hearing the wake word, research on this topic has found that more than 1,000 word sequences can incorrectly trigger smart speakers such as Alexa. Furthermore, it has been revealed that after a user speaks to an Alexa device, Amazon captures and stores a voiceprint of the user along with a transcription of the request, and deletes neither the voiceprint nor the transcription.
In summary, the current architecture of most virtual assistants and smart speakers does not align with HIPAA restrictions, particularly regarding access to protected health information (PHI).
Virtual assistants therefore have the potential to violate patient privacy as well as state and federal laws; hence, it is the intent of this policy to ensure that no information is divulged without the authorization or consent of its owner.