It could. It’s in their best interest to know how to make it either 1) enforceable, which is hard; or 2) require that companies developing and deploying high-risk systems dedicate enough resources and funding to research on effectively overcoming this challenge.
Lawyer me says it’s a wonderful consultancy opportunity for people who have spent years on this issue and actually have a methodology worth exploring and funding. The opportunity to make this provision more specific was missed (the AI Act is now fully in force), but there will be future guidance and directives. Which means funding opportunities that hopefully push big tech to direct more resources to research. But this only happens if we can make policymakers understand what works, the current state of the shutdown problem, and how to steer companies in the right direction.
(Thanks for your engagement here and on LinkedIn, much appreciated 🙏🏻).
Ah, I see! I agree it could be more specific.