The SANDBOX Act, if passed by Congress, would allow companies to apply for modifications to, or waivers from, any regulation deemed "obstructive" to the testing and deployment of products or services that use or contain, "in whole or in part," at least one AI system. In return, companies would be required to disclose plans to mitigate risks to consumer safety and finances.
On the right, at the Westin DC Downtown: NatCon, a gathering of Trump officials and allies calling for the persecution of AI developers and the expulsion of the "insufficiently American." On the left, at the Salamander Hotel on the waterfront: the Abundance Conference, whose annoyingly optimistic proponents envision an American techno-utopia, which could be realized if governments just stopped regulating so damn much.
Hawley said he planned to introduce a bill on the topic "soon," but declined to offer further details. A self-styled populist, Hawley staked out his stance against autonomous vehicles in a speech at the National Conservatism Conference last week, where he cast AI as part of a transhumanist project and called for a series of restrictions on the technology. "Only humans should drive cars and trucks," Hawley said at the time.
Plenty of lousy votes were taken during the summer's Congressional sessions, when President Trump's omnibus "Big Beautiful Bill" eventually passed after numerous senators and House members extracted their various pounds of flesh from it. Trump gave concessions to senators from Alaska, Wyoming, and many other states to win continued tax cuts for billionaires, along with deep cuts to Medicaid and to funding for rural hospitals.
The TUC's paper, "Building a pro-worker AI innovation strategy," warns that short-term priorities driven by the UK's corporate governance system mean AI may be used by some employers to cut costs and automate existing processes rather than to invest, expand and innovate. "Such decisions will more likely displace or deskill workers rather than augment, expand or retrain the workforce as part of technological upgrading," said the TUC.
The lack of federal regulation in the U.S. leaves AI companies operating under a loose framework, in contrast with Europe's stricter data and privacy laws, and it has amplified calls for greater transparency and safety.
Europe is heading down the wrong path on AI. This code of practice introduces a number of legal uncertainties for model developers, as well as measures that go far beyond the scope of the AI Act.
Henna Virkkunen, the EU Commission's Executive Vice-President for Tech Sovereignty, argued that the AI Liability Directive would have led to fragmented rules across EU member states.
The AI Infrastructure directive addresses energy and permitting issues associated with data centers and the computational demands of running AI applications, requiring the Department of Energy to issue requests for proposals (RFPs).
Every product leader used to brag about how quickly they could ship their product. With the rise of new regulations, however, today's top PMs brag about their ability to ship fast while also showing their work: dataset lineage, bias tests, and audit hooks, all in place before any code reaches production.
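In practice, that "show your work" requirement tends to become a compliance gate in the release pipeline. The following is a minimal sketch of such a gate; the artifact file names, JSON fields, and the bias threshold are all hypothetical illustrations rather than anything prescribed by a specific regulation or tool.

```python
# Hypothetical pre-deployment compliance gate: block a release unless the
# required artifacts (lineage, bias report, audit hooks) exist and pass.
import json
import sys
from pathlib import Path

# Assumed artifact locations for this sketch only.
REQUIRED_ARTIFACTS = {
    "dataset_lineage": Path("compliance/dataset_lineage.json"),
    "bias_report": Path("compliance/bias_report.json"),
    "audit_hooks": Path("compliance/audit_hooks.json"),
}

MAX_DEMOGRAPHIC_PARITY_GAP = 0.05  # made-up release threshold


def check_artifacts() -> list[str]:
    """Return human-readable failures; an empty list means the gate passes."""
    failures = []

    # Every required artifact must exist before code ships.
    for name, path in REQUIRED_ARTIFACTS.items():
        if not path.exists():
            failures.append(f"missing {name} artifact: {path}")

    # If a bias report exists, its headline metric must be under the threshold.
    bias_path = REQUIRED_ARTIFACTS["bias_report"]
    if bias_path.exists():
        report = json.loads(bias_path.read_text())
        gap = report.get("demographic_parity_gap")
        if gap is None or gap > MAX_DEMOGRAPHIC_PARITY_GAP:
            failures.append(
                f"bias metric {gap} exceeds limit {MAX_DEMOGRAPHIC_PARITY_GAP}"
            )

    return failures


if __name__ == "__main__":
    problems = check_artifacts()
    for p in problems:
        print(f"BLOCKED: {p}")
    sys.exit(1 if problems else 0)
```

A script like this would typically run as a CI step, so a missing lineage file or an out-of-range bias metric fails the build rather than relying on a manual checklist.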
The part of the bill of most interest to the tech industry was its provision for a moratorium on state-level regulation of AI, which would have rendered all AI regulation that didn't come from the federal government unenforceable for 10 years.
"State and local governments should have the right to protect their residents against harmful technology and hold the companies responsible to account," said Jonathan Walter, a senior policy adviser at The Leadership Conference's Center for Civil Rights.
"While I appreciate Chairman Cruz's efforts to find acceptable language that allows states to protect their citizens from the abuses of AI, the current language is not acceptable to those who need these protections the most."