Hi: I'm trying to find the fastest way to copy a whole whack of ".csv.gz" files from a URL to my hard drive.
I am able to extract the list of the file URLs, but I don't want to read each file into memory and then write it to a new file on my hard drive - that will take too long. Is there a way to simply copy the file?
Thanks!
Greg
copying files from an HTML server
Re: copying files from an HTML server
Hi Greg,
doesn't "libUrlDownloadToFile" work for you?
It is non-blocking and does not read the whole file into memory like "load" does.
Best
Klaus
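For anyone finding this thread later, here is a rough sketch of Klaus's suggestion in LiveCode Script. The URL list, the destination folder, and the "Log" field are placeholders, and the status check via the URLStatus function is an assumption based on libURL's documented behavior - adapt to your own stack:

```livecode
on downloadFiles pURLList
   -- pURLList: one file URL per line (hypothetical input)
   set the itemDelimiter to "/"
   repeat for each line tURL in pURLList
      -- save each file under its own name in the Documents folder
      put specialFolderPath("documents") & "/" & the last item of tURL into tPath
      -- non-blocking: libURL writes the data straight to disk
      -- and sends "downloadDone" when the transfer finishes
      libUrlDownloadToFile tURL, tPath, "downloadDone"
   end repeat
end downloadFiles

on downloadDone pURL
   -- callback for each finished transfer; check how it ended
   put URLStatus(pURL) into tStatus -- e.g. "downloaded" or "error"
   if tStatus is "error" then
      put "Failed:" && pURL & return after field "Log" -- hypothetical log field
   end if
end downloadDone
```

Because the downloads are non-blocking, the repeat loop queues them all and returns immediately; the callback fires once per file as each one completes.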
Re: copying files from an HTML server
yep - works GREAT! thanks.