Sounds like you’d just need to obfuscate the link targets (to avoid filtering) and publish links on major AI crawler targets, yeah?
The Internet was only needed to bootstrap LLMs. From this point on, future iterations can be built from synthetic data and high-quality sources alone.