MPs call for stricter AI regulations in the UK

More than 100 UK lawmakers have urged tighter AI regulation. The cross-party group, backed by Control AI and high-profile figures, warns that unrestrained artificial intelligence could rival nuclear arms in its threat potential, calling on the government to introduce enforceable safeguards against corporate acceleration.


Over 100 UK lawmakers have called on the government to enforce tougher rules on artificial intelligence, cautioning that ministers risk allowing the technology to progress without restraint. The cross-party movement, spearheaded by prominent figures such as former defence secretary Lord Des Browne and ex-environment minister Lord Zac Goldsmith, contends that superintelligent AI could become “the most dangerous technological advancement since nuclear arms.”

The initiative, led by Control AI, which counts Skype co-founder Jaan Tallinn among its backers, is urging Prime Minister Sir Keir Starmer to resist pressure from the United States, where the White House has historically opposed strict AI regulation. Lord Goldsmith remarked, “While high-ranking individuals in AI are raising alarms, governments are lagging far behind the corporations.”

The warning coincides with claims from Silicon Valley specialists that the risks are escalating rapidly. Jared Kaplan, chief scientist at Anthropic, suggested that by 2030 humanity might face a crucial decision about whether to permit AI systems to independently enhance their own capabilities.

In 2023, the UK convened a safety summit, creating a security institute and underscoring the potential for “serious, even catastrophic harm” from advanced AI. Nevertheless, the UK has since drawn criticism for its lack of global collaboration or enforceable regulations. The government’s strategy has depended on existing sector regulators rather than definitive legal frameworks. The Department for Science, Innovation and Technology (DSIT) asserts that the technology is already under regulation, claiming its framework can adapt to technological evolution. Critics argue, however, that this approach leaves the country lagging behind AI firms that continue to develop cutting-edge models with scant oversight.

Former AI minister Jonathan Berry has pushed for enforceable global regulations with “tripwires” for extremely powerful models, ensuring they undergo testing and come with shutdown mechanisms. Andrea Miotti, CEO of Control AI, stated, “AI firms are pressuring governments to delay regulation, arguing it would stifle innovation – from some of the same companies that assert AI could endanger humanity. It’s quite pressing.”

While the debate continues, businesses are adopting AI at pace. Research by consultancy Elixir indicates that nearly half of UK companies are making substantial investments in AI, with early adopters reporting notable cost reductions.



  • Hargreaves Lansdown services resume after glitch
    Hargreaves Lansdown resolves technical issues disrupting customer access. The UK’s largest DIY investment platform confirmed that clients can now access their accounts following a technical problem that disrupted services amid market volatility. The platform assured customers their assets and data remained secure.


  • Retirement gaps create business planning risk
    Most workers are off course for retirement, Flagstone survey finds. The savings platform said the mismatch between planned and likely retirement ages is becoming a workforce-cost issue for employers, not just a personal finance problem.


  • Corelight pushes agentic AI into SOCs
    Corelight is targeting repetitive security triage with transparent AI tools. The cybersecurity company said new agentic workflows can speed investigations while preserving an auditable trail of playbooks, evidence, and analyst actions.