We’ve been doing quite a bit of work behind the scenes on Canalplan; a lot of it has been related to caching data for search engines.
Canalplan gets a LOT of hits from the likes of Google, Bing etc. and we were using quite a lot of resources delivering pages to them. So we put together a system so that search engines (and only search engines) are served a static version of the page from a stored HTML file. If there isn’t a static file when the search engine requests the page then it’s generated for them. When the data for a page changes, the stored page is deleted from the cache.
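The flow described above can be sketched in a few lines. This is just an illustrative sketch in Python (Canalplan itself runs on PHP), and the cache directory, bot user-agent tokens and `render` callback are all assumptions, not the real implementation:

```python
import os

CACHE_DIR = "cache"  # hypothetical location for the stored HTML files

# Assumed user-agent tokens for the crawlers we want to serve statically
SEARCH_ENGINE_TOKENS = ("Googlebot", "bingbot")

def is_search_engine(user_agent):
    """Crude user-agent check; real bot detection would be more robust."""
    return any(token in user_agent for token in SEARCH_ENGINE_TOKENS)

def serve_page(page_id, user_agent, render):
    """Serve a cached static copy to crawlers; render live for everyone else."""
    if not is_search_engine(user_agent):
        return render(page_id)            # real users always get a live page
    path = os.path.join(CACHE_DIR, f"{page_id}.html")
    if not os.path.exists(path):          # cache miss: generate and store
        html = render(page_id)
        os.makedirs(CACHE_DIR, exist_ok=True)
        with open(path, "w") as f:
            f.write(html)
        return html
    with open(path) as f:                 # cache hit: serve the stored copy
        return f.read()

def invalidate(page_id):
    """Delete the stored copy when the page's underlying data changes."""
    path = os.path.join(CACHE_DIR, f"{page_id}.html")
    if os.path.exists(path):
        os.remove(path)
```

The key point is that only crawler traffic ever touches the cache: real users never see a stale page, and a bot's cache miss simply costs one live render.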
It takes fewer resources to serve a static HTML file, as Apache can deliver it directly and we don’t need to spin up a Canalplan process. That means everything runs faster for the search engines AND those precious resources can be used to deliver data to real users.
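Letting Apache deliver the stored file directly can be done with mod_rewrite rules along these lines. This is a hedged sketch, not our actual configuration: the `/cache/` path and the bot user-agent list are assumptions for illustration:

```apache
RewriteEngine On
# Only rewrite for known crawler user agents (illustrative list)
RewriteCond %{HTTP_USER_AGENT} (Googlebot|bingbot) [NC]
# ...and only if a stored static copy of the requested page exists
RewriteCond %{DOCUMENT_ROOT}/cache/%{REQUEST_URI}.html -f
# Serve the static file directly, never touching PHP
RewriteRule ^(.*)$ /cache/$1.html [L]
```

If either condition fails, the request falls through to the normal dynamic handler, which can then generate and store the page.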
We also implemented a similar process for the boat index, which saves executing PHP and making database connections for each cached page.