This is a tricky problem in general, though partial, special-purpose solutions abound, especially unidirectional ones. For example, you can get a read-only copy of Wikipedia for offline use in your remote mountain village, but there is no easy way to contribute your updates back to the live version on the main internet.
Also, you should cache the internet for offline use even while the network is healthy, because nation states are war-gaming how to destroy the internet, and we little people will suffer when that happens and we can't get our YouTube instructional videos on how to survive the apocalypse.
Offline automatic filesync
ArchiveBox takes a list of website URLs you want to archive, and creates a local, static, browsable HTML clone of the content from those websites (it saves HTML, JS, media files, PDFs, images and more).
You can use it to preserve access to websites you care about by storing them locally offline. ArchiveBox imports lists of URLs, renders the pages in a headless, authenticated, user-scriptable browser, and then archives the content in multiple redundant common formats (HTML, PDF, PNG, WARC) that will last long after the originals disappear off the internet. It automatically extracts assets and media from pages and saves them in easily-accessible folders, with out-of-the-box support for extracting git repositories, audio, video, subtitles, images, PDFs, and more.
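ArchiveBox itself does the heavy lifting, but the core loop is easy to picture. Here is a minimal sketch of the take-a-URL-list, save-local-snapshots idea; this is not ArchiveBox's actual code, and the folder-naming scheme and single-format fetch are my simplifications:

```python
import hashlib
import time
from pathlib import Path
from urllib.parse import urlparse
from urllib.request import urlopen

def snapshot_dir(root: Path, url: str, timestamp: float) -> Path:
    """One folder per (url, time) pair, like the per-snapshot layout real archivers use."""
    slug = urlparse(url).netloc.replace(":", "_")
    digest = hashlib.sha256(url.encode()).hexdigest()[:8]
    return root / f"{int(timestamp)}-{slug}-{digest}"

def archive(urls: list[str], root: Path) -> list[Path]:
    """Fetch each URL and store the raw HTML locally."""
    saved = []
    for url in urls:
        out = snapshot_dir(root, url, time.time())
        out.mkdir(parents=True, exist_ok=True)
        with urlopen(url) as resp:  # no headless browser here, just the raw response
            (out / "index.html").write_bytes(resp.read())
        saved.append(out)
    return saved
```

The real tool renders each page in a headless browser first, so JavaScript-built pages are captured too, and writes several redundant formats (WARC, PDF, PNG) per snapshot rather than one HTML file.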
AFAICT there is no way to contribute upstream. But a reasonably simple and well-curated option is the Kiwix offline Wikipedia, which can give you everything, everything minus pictures, only "medical" articles, only "school" articles, and so on.
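Those flavours are just different ZIM files. A hedged sketch of picking and serving one, assuming the kiwix-tools package is installed; the flavour names below are illustrative and the real filenames carry dates, so check the listing at download.kiwix.org first:

```shell
# Flavour names are illustrative; browse https://download.kiwix.org/zim/wikipedia/
# for the real, dated filenames.
FLAVOR=wikipedia_en_all_nopic      # full text, no images
# FLAVOR=wikipedia_en_medicine     # medical articles only
ZIM_URL="https://download.kiwix.org/zim/wikipedia/${FLAVOR}.zim"
echo "$ZIM_URL"

# Then serve it on the LAN (requires kiwix-tools):
#   kiwix-serve --port=8080 "${FLAVOR}.zim"
```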
devdocs.io is an excellent offline cache of API docs that works from your browser.
I cannot tell whether Fallback is a genuinely viable project or an art project designed to make a point. Filip Visnjic's review gives an overview: it caches internet news just as a nation, monitoring net activity, imposes an internet shutdown on a restive population.
Fallback is triggered by powerful forecasting algorithms providing a backup right when it is needed. “We constantly monitor the probability of Internet shutdowns worldwide” – Quifan tells CAN. The prediction is done by trend analysis of the appearance frequency of certain keywords in the online world.
…the system scrapes headlines and articles from news platforms, encrypts them, and sends them over satellite to the Portal devices (Raspberry Pi Zero W with an E-ink module). The Portal receives data over satellite, decrypts it, formats it into news articles, and provides its own WiFi access point, so no Internet is required.
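The "trend analysis of keyword frequency" bit is vague in the write-up. A toy version of what it might mean (my guess at the shape of it, not the project's actual code) flags a shutdown-risk spike when today's count of a watchword far exceeds its trailing average:

```python
from collections import deque

def spike_detector(window: int = 7, ratio: float = 3.0):
    """Return a callable that flags a count much larger than the trailing mean."""
    history: deque[int] = deque(maxlen=window)

    def observe(count: int) -> bool:
        baseline = sum(history) / len(history) if history else None
        history.append(count)
        # Flag only once we have a baseline and the new count dwarfs it.
        return baseline is not None and baseline > 0 and count >= ratio * baseline

    return observe

# e.g. daily counts of "internet shutdown" mentions in scraped headlines
observe = spike_detector()
daily = [4, 5, 3, 4, 6, 5, 4, 30]
flags = [observe(c) for c in daily]
```

A real system would presumably track many keywords, weight sources, and smooth out weekly news cycles, but the trigger logic reduces to something like this.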