I am using cURL to download large XML files (between 500 MB and 1 GB) from remote servers. The script works fine for small test files, but every time I try to download a file larger than a few hundred megabytes, the script hangs: it does not stop, there is no error message, it just sits there hanging.
I am running the script from the command line (CLI), so PHP should have no execution time limit. I have tried cURL's verbose mode, but it shows nothing beyond the initial connection. Every time I download the file, it stops at exactly the same size (463.3 MB), and the XML file is incomplete at that point. Again, this script works fine with small files, but with every large file I try, it hangs without ever printing an error message. Could something else (Ubuntu 10.04 on Linode) be killing the script? As far as I understand, since I am running it through the CLI, my web server should not make any difference. Here is the script:

$ch = curl_init();
$fh = fopen($filename, 'w');
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_FILE, $fh);       // write the body straight to the file handle
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_TIMEOUT, 0);      // no cURL-side timeout
if (curl_exec($ch) === false) {
    echo 'cURL error: ' . curl_error($ch) . "\n";
} else {
    echo 'Operation completed without any errors';
}
$response = array('header' => curl_getinfo($ch));
curl_close($ch);
fclose($fh);
if ($response['header']['http_code'] == 200) {
    echo "File downloaded and saved to " . $filename . "\n";
}

Any ideas would be much appreciated. Thanks, mat

You could try downloading the file in parts. Or, if you have access to the remote server, you could archive (compress) the file there and then download the archive instead. You can also check your php.ini configuration: look at the file size and memory limits, among other settings.
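If the server supports it, here is a minimal sketch of the "download it in parts" idea using HTTP Range requests. It assumes the server answers ranged requests with 206 Partial Content; the URL, filename, and 10 MB chunk size below are placeholders:

$url      = 'http://example.com/big.xml';  // placeholder
$filename = 'big.xml';
$chunk    = 10 * 1024 * 1024;              // 10 MB per request
$fh     = fopen($filename, 'w');
$offset = 0;
while (true) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_FILE, $fh);
    // Ask for just this byte range; cURL appends to the same open handle.
    curl_setopt($ch, CURLOPT_RANGE, $offset . '-' . ($offset + $chunk - 1));
    // Fail instead of writing an error page to the file on HTTP >= 400,
    // e.g. 416 when we request past the end of the file.
    curl_setopt($ch, CURLOPT_FAILONERROR, true);
    $ok   = curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    $got  = (int) curl_getinfo($ch, CURLINFO_SIZE_DOWNLOAD);
    curl_close($ch);
    // Stop on error, on a non-partial response, or once a short final chunk arrives.
    if ($ok === false || $code != 206 || $got < $chunk) {
        break;
    }
    $offset += $chunk;
}
fclose($fh);

Each request is short-lived, so if one chunk stalls you can retry just that range instead of restarting the whole transfer.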
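For the php.ini suggestion, you can print the limits your PHP binary actually uses (these are standard PHP ini directives; run it from the CLI, since the CLI and the web server often load different php.ini files):

// Print the settings most relevant to large downloads.
foreach (array('memory_limit', 'max_execution_time', 'default_socket_timeout') as $key) {
    echo $key . ' = ' . ini_get($key) . "\n";
}

Or, from the shell: php -i | grep -E 'memory_limit|max_execution_time'.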