A few months ago, Ian Hogarth wrote a Financial Times op-ed headlined “We must slow down the race to God-like AI.”
A few weeks ago, he was appointed head of the UK Foundation Model Taskforce and given £100 million to dedicate to AI safety, to universal acclaim. Soon there will also be a UK Global AI Summit.
He wrote an op-ed in The Times asking everyone for their help, with an accompanying Twitter thread. Based on a combination of sources, I am confident that this effort has strong backing for the time being (though such backing is always fragile), and that it is aimed squarely at the real target, extinction risk from AI, with a strong understanding of what it would mean to have an impact on that.
Once again: The real work begins now.
The UK Taskforce will need many things in order to succeed. It will face opposition within the government, outside it, and internationally. There is a narrow window before the AI Summit in which to hit the ground running and establish capability and credibility.
The taskforce represents a startup-government mindset that makes me optimistic, and that seems like the best hope for making government get things done again, including on other vital [...]
---
First published:
July 10th, 2023
Source:
https://www.lesswrong.com/posts/xgXcZQd5eqMqpAw3i/consider-joining-the-uk-foundation-model-taskforce
Narrated by TYPE III AUDIO.