News and Commentary Impacting Data Privacy and Cybersecurity Programs

What Could Have Been: Artificial Intelligence Bill Dies as CT Legislative Session Ends

Connecticut Senate Bill No. 2, “An Act Concerning Artificial Intelligence,” passed the Senate but was not taken up by the House before the 2024 legislative session closed on Wednesday, May 8, 2024. The bill would have imposed several rules on the use of artificial intelligence, including restrictions on its use in election campaigns and a prohibition on AI-generated revenge porn. Critics, however, worried that the legislation, which included reporting requirements for impacted organizations, would have discouraged innovation and the growth of AI-related business. Governor Ned Lamont had expressed concerns about the bill’s effect on the state’s business climate and signaled he would not have signed it into law. Colorado’s governor expressed similar “reservations” but ultimately set them aside when he signed the Colorado AI Act on May 17, 2024, a law substantially similar to the bill Connecticut failed to pass.

HIPAA Privacy Rule Addresses Reproductive Health Information

On April 22, 2024, the Office for Civil Rights (OCR) at the U.S. Department of Health and Human Services (HHS) announced a Final Rule on reproductive health privacy. The HIPAA Privacy Rule to Support Reproductive Health Care Privacy, effective June 25, 2024, prohibits the disclosure of protected health information (PHI) related to lawful reproductive health care in certain circumstances. With Roe v. Wade overturned and twenty-one states banning or restricting abortion procedures, the Rule is intended to protect patient confidentiality and to prevent medical records from being used against people for providing or obtaining lawful reproductive health care. The Final Rule applies to most health care providers, health plans, health care clearinghouses, and their business associates (collectively, “regulated entities”).

Key Provisions of HIPAA Privacy Rule to Support Reproductive Health Care Privacy

Disclosure Protections: Prohibits the use or disclosure of PHI when it is sought to investigate or impose liability on individuals, health care providers, or others who seek, obtain, provide, or facilitate lawful reproductive health care.

Signed Attestation Requirements: Requires regulated entities to obtain a signed attestation that certain requests for PHI potentially related to reproductive health care are not for the prohibited purposes.

Updates to Notice of Privacy Practices: Requires regulated health care providers, health plans, and clearinghouses to modify their Notice of Privacy Practices to support reproductive health care privacy.

No. 17: Understanding the Nebraska Data Privacy Act

Nebraska has become the 17th state to enact a comprehensive consumer data privacy law. The Nebraska Data Privacy Act (NDPA) takes effect January 1, 2025. Like its 16 predecessors, the NDPA establishes fundamental rights for Nebraska residents to exercise control over their personal data and obligates businesses, among other things, to disclose their personal data practices and to provide mechanisms for opting out of the sale of personal data, targeted advertising, and automated decision making based on user profiles. The NDPA joins the states that expressly deny a private right of action, leaving enforcement exclusively to the Nebraska Attorney General.

The NDPA stands out in at least two respects. First, unlike other state privacy laws that condition applicability on gross revenue or transaction-volume thresholds, the NDPA applies to the processing of even a single resident’s personal data. Second, it exempts “small businesses,” as defined by federal statute, subject to one narrow exception: small businesses otherwise exempt from the NDPA still may not sell “sensitive” personal data without first obtaining consumer consent.

Concluding Consideration

While Connecticut ultimately declined to lead the way in regulating the business of AI, Colorado has forged ahead, and we expect other states to follow. Moreover, a majority of state consumer privacy laws already regulate AI, albeit indirectly, to the extent that they require disclosures and opt-out mechanisms around the use of automated or algorithmic decision-making solutions that process personal data.

Proactive organizations and risk management programs should begin crafting policies around data privacy and the implementation of AI solutions now, to avoid future legal grey areas involving privacy violations and consumer harm. Appropriate measures might include:

  • Establishing an approval process for the use of personal consumer/employee data and AI in the workplace, including consequences for employees who use AI tools in secret.
  • Drafting contractual provisions that allocate risk where AI solutions may infringe on consumer privacy.
  • Providing training, raising awareness, and holding transparent conversations with employees about why these initiatives matter, and addressing their concerns.

For further information or guidance on these issues, please contact:

Sherwin M. Yoder, CIPP/US, CIPP/E and CIPM
Partner
203.575.2649
syoder@carmodylaw.com

This information is for educational purposes only to provide general information and a general understanding of the law. It does not constitute legal advice and does not establish any attorney-client relationship.