New Problem for AI Is State Regulation

If federal regulation doesn’t materialize, states will fill the void, making it increasingly difficult to stay compliant.

Whether it’s wrong answers from search, bias undercutting marketing, or outright fraud, generative artificial intelligence comes with problems. That isn’t surprising. Any technology, tool, process, or system has innate weaknesses or complexities.

There are different ways of addressing weaknesses: modifying tools, limiting how a given technology is used, finding workarounds, and the like. Another is regulation. Governments can demand operating standards, safeguards, modifications, limitations, disclosures, or what have you.

And this is why, when regulation seems to be in the air, many companies push for national rules. Some compliance burden may be unavoidable, but better to comply once with requirements that apply everywhere than to navigate a patchwork of obligations that vary from one geographic region to the next.

But that doesn’t seem to be the way AI regulation is developing. There has been talk of national requirements or limitations, but so far nothing has materialized. Instead, it’s a bit of a mess, as The Hill reported in one of its newsletters:

“State legislators have introduced nearly 650 bills relating to AI in 45 states, according to MultiState, a government affairs firm that tracks AI legislation. Already, 55 of those bills have passed,” they wrote. “Those bills focus largely on preventing algorithmic discrimination, disclosing when generative AI is used, and barring deepfakes. Lawmakers are also considering how ChatGPT might be used in schools or government procedures. And more is coming next year.”

No matter how advantageous using generative AI might seem, this patchwork is an enormous hurdle, and depending on where a CRE company operates, one that could be next to impossible to clear without a great deal more information. In theory, a company could make a reasonable effort to anticipate what a given legislature might do, only to discover it guessed wrong. Or efforts to comply with a law in one state could put its practices in direct conflict with the requirements of another.