All wiki pages in a download
- Caligari87
- Admin
- Posts: 6233
- Joined: Thu Feb 26, 2004 3:02 pm
- Preferred Pronouns: He/Him
- Contact:
All wiki pages in a download
Can I get all the info in the Wiki in a single download? I can't get online very much, except at school, and I don't like having to DL every individual page that I need. Is this possible?
- Sticky
- Posts: 836
- Joined: Mon Aug 04, 2003 12:29 pm
- Location: Denver, CO
How do they get teflon to stick to the pan?
I don't use synchronize, as I would die if exposed to 8 consecutive hours without internet, but isn't there some way to use it to grab all the wiki pages? I'm not sure how well searching would work... you'd probably have to do that manually. I dunno, can anyone who uses synch shed a little light on how it works?
- chopkinsca
- Posts: 1325
- Joined: Thu Dec 11, 2003 5:03 pm
In Internet Explorer, add the wiki main page to your favourites. There will be a "Make available offline" checkbox. Checking this box adds new tabs with options for some things, one being how deep into the website you want to make available offline (3 levels is the max).
I've never used this before, so I am not sure of the results.
Edit: Using the 3-deep option takes quite some time...
Caligari_87 wrote:While I'm waiting for a response, what would be the reaction if I bumped the oldest thread in every category? I don't know why I'm even thinking of that.
No way. I even once *suggested* having a National Bump Old Threads day, and HotWax flamed me for that.

(No offense, HotWax. I was asking for it.)

Re: All wiki pages in a download
Caligari_87 wrote:Can I get all the info in the Wiki in a single download? I can't get online very much, except at school, and I don't like having to DL every individual page that I need. Is this possible?
You could try wget:
http://space.tin.it/computer/hherold/
for MS-DOS/Windows binaries
http://wget.sunsite.dk/
for Unix
Haven't tried this myself (yet), so I dunno if it works.
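If those binaries work, a basic recursive grab might look something like this (untested; http://example.org/wiki/Main_Page is just a placeholder for the wiki's actual main page, and older wget builds may not support every option):

    # Mirror the wiki up to 3 links deep, rewriting links so the pages
    # work offline and pulling in the images/CSS each page needs.
    # --no-parent keeps wget from wandering above the start directory;
    # --wait=1 adds a polite pause between requests.
    wget --recursive --level=3 --convert-links --page-requisites \
         --no-parent --wait=1 http://example.org/wiki/Main_Page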
The wiki is stored in a form that's efficient to edit, not a form that's efficient to download. If you really want to try downloading it automatically, do not download any of the special pages, do not download any page discussion, do not download any page history, do not download any edit pages, do not download any what links here pages, do not download any related changes pages, and there are probably other pages not to download, too.
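For what it's worth, newer wget builds (1.14 and later, so not the old binaries linked above) can express most of those exclusions as a single regex. This is an untested sketch, and the URL patterns assume a standard MediaWiki layout, where "what links here" and "related changes" live under Special: pages:

    # Skip the non-article pages: Special: (which also covers
    # "what links here" and "related changes"), Talk: discussion pages,
    # and the edit/history views of each page.
    # Requires wget >= 1.14 for --reject-regex.
    wget --recursive --level=3 --convert-links --no-parent \
         --reject-regex '(Special:|Talk:|action=(edit|history))' \
         http://example.org/wiki/Main_Page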
-
- Posts: 198
- Joined: Mon Apr 26, 2004 11:32 am
randy wrote:If you really want to try downloading it automatically, do not download any of the special pages, do not download any page discussion, do not download any page history, do not download any edit pages, do not download any what links here pages, do not download any related changes pages, and there are probably other pages not to download, too.
That's a lot not to download. If there were an exclusion filter list that one could just copy into HTTrack, it would be great.
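Assuming the wiki uses standard MediaWiki URLs, a starting set of HTTrack scan rules might look like this (untested; the site address is a placeholder, and a '-' rule excludes whatever matches the pattern):

    # '-' rules exclude matching URLs; '*' is a wildcard.
    # Special:* also catches "what links here" and "related changes";
    # Talk:* drops discussion pages; the action= rules drop the
    # edit forms and page histories.
    httrack "http://example.org/wiki/Main_Page" -O ./wiki-mirror \
        "-*Special:*" "-*Talk:*" "-*action=edit*" "-*action=history*"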